The contest

Presentation of the contest

For the 2020–2021 edition, the SWERC contest was held online.

SWERC has essentially the same structure as the ICPC contest, except that it takes place at the level of Southwestern Europe. The contest is a competition between teams, each of which represents an institution from the region, such as a university or an engineering school. Institutions choose the teams that they wish to send to SWERC; they cannot send more than a few teams, so they usually run an internal selection if too many students are interested.

Each team comprises three students from the institution. Each student must satisfy the eligibility requirements (see also team composition). The teams of each institution are accompanied by a coach, who serves as the contact person for the institution: the coach is usually an older student or a faculty member. The coach is the person who registers the teams (see registration) and usually organizes their travel, hosting, etc.

This year, the contest was held online. It consisted of an opening ceremony on Saturday morning, a mock contest on Saturday afternoon followed by a Q&A session, the main contest on Sunday morning, and an awards ceremony on Sunday afternoon (see the schedule). Contestants participated from their own machines, with Internet access, on an online judging system.

At the start of the contest, the problems are made available on the judging system. Contestants have five hours to solve as many of these problems as possible. Each problem statement is written in English and describes a real-life situation that can be solved with algorithmic techniques, e.g., finding the shortest route from one point to another in a city, computing the area of a polygon, etc. To solve the problem, the team must write a computer program that gives the correct answer to the challenge, using one of the official programming languages of the contest. The problem statement specifies the input and output format, limits on the problem parameters (instance size, etc.), as well as limits on memory and running time.
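For illustration only, here is a minimal sketch of what a submitted program can look like, for an invented problem (not taken from an actual SWERC edition) that asks to read a number of test cases and, for each one, print the sum of two integers; the input format and limits are assumptions made up for this example.

    #include <cstdio>

    // Invented example problem: the first line of the input contains the number of
    // test cases T; each of the next T lines contains two integers a and b.
    // For each test case, print a + b on its own line.
    int main() {
        int t;
        if (std::scanf("%d", &t) != 1) return 0;
        while (t--) {
            long long a, b;
            std::scanf("%lld %lld", &a, &b);
            std::printf("%lld\n", a + b);  // 64-bit arithmetic avoids overflow on large values
        }
        return 0;
    }

Reading from standard input and writing to standard output, as above, is the usual convention on judging systems, but every problem statement specifies its own format.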

When a team thinks that they have solved a problem, they submit their program to an automated judging system for evaluation. The judging system tests the program on a secret set of inputs and outputs, and verifies that the program always returns the correct answer, without crashing and within the running time and memory limits. If the program passes the tests, the judge gives an ACCEPTED verdict, and the team has successfully solved the problem. Otherwise, the judge gives a verdict with some information about the issue (e.g., WRONG ANSWER when the program returns an incorrect answer on some test case, TIME LIMIT EXCEEDED if the program did not finish within the time limit, RUNTIME ERROR if it crashed, etc.). In this case, the contestants have to identify the issue and fix it: they are not given any further information about the cause of the error or the test case that triggered it.

During the contest, the team members work together, but they cannot receive help from their coach or from anyone outside the team, and they cannot communicate with other teams. The problems must be solved in the programming languages of the contest, i.e., C, C++, Java, Kotlin, Python 3, and OCaml. The code is compiled and evaluated on the judging system: see the environment information for details about the versions used.

At the end of the contest, the teams are ranked by the number of problems that they have solved, and ties are broken based on the time that they took to solve them (see scoring). This is how the final standings for the contest are derived; see for instance the standings from SWERC 2017. The best teams receive medals: the top 2 teams receive gold medals, the next 4 teams receive silver medals, and the next 8 teams receive bronze medals. Further, the highest-ranking teams (usually at least the first two) are selected to participate in the ICPC World Finals, where they compete against teams from all over the world.
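As an illustration of the tie-breaking, assuming the usual ICPC penalty rule of 20 extra minutes per rejected submission on a problem that is eventually solved (see the scoring page for the authoritative rules of this edition): a team that solves one problem at minute 30 on its second attempt and another problem at minute 120 on its first attempt has 2 solved problems and a total time of 30 + 20 + 120 = 170 minutes; it ranks above any team with fewer solved problems, and above any team with 2 solved problems and a total time greater than 170 minutes.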

The organization committee of SWERC changes every few years, and usually comprises faculty from the host institution as well as volunteers. See organizers for details about the current organizers. SWERC 2017, 2018, and 2019–2020 were held at Télécom Paris, and before that SWERC 2016 was held at the University of Porto; see past editions for details about previous editions.

Opinion poll

An opinion poll about SWERC'20–21 was conducted after the event. We received 28 responses. The aggregated results are available here.

Reports

The SWERC contest is managed by a French nonprofit association called CPCI ("comité de promotion des concours informatiques", i.e., committee for the promotion of informatics contests; RNA number W751238568). After SWERC 2020–2021, we prepared an activity report and a financial report for the activities of this nonprofit. The activity report gives some details of how the event was organized and run, and the financial report gives a breakdown of all income and expenses incurred in preparing the contest. These documents are available here:

You can also refer to the same documents for SWERC'19.

Preparing for the contest

To perform well at SWERC, it is important to train in advance. Teams should train together to gain experience of collaborating to solve problems, sharing the single computer between team members, and allocating time between problems. (In particular, problems that are not ACCEPTED do not contribute to the score, so it is important to finish the problems that you start. Further, the time-based scoring rules encourage teams to solve the easy problems first.)

Team members should also become acquainted with the format of the problems and with the algorithmic concepts that usually occur in them; and of course they should practice with one of the official programming languages, so that they can write code quickly and without errors, and debug it efficiently when something goes wrong.

The best way to train as a team is to get together with your team members around a single computer for a fixed duration (five hours or a bit less), and try your luck on some problems from previous years without using the Internet. There are websites that collect such problems and also provide a judging system similar to the one used in the contest. A large collection of ICPC problems is offered on websites such as ICPC Live Archive and UVa Online Judge; in particular, the problems of SWERC 2017 can be found on ICPC Live Archive and on UVa Online Judge. To use the judging system on these websites, you need to create an account (on UVa Online Judge, on ICPC Live Archive). There are other websites to train for programming contests in general, such as Codeforces, Topcoder, and France-IOI (in French).

There is also plenty of documentation online about the format of the ICPC contest and how to train for it. Here are a few links: a Quora thread, a GeeksforGeeks page.

The best teams that participate in ICPC often prepare a team reference document (also known as a "notebook") with the code for common algorithms, to be used during the contest. See the regulations page for details about what the notebook can contain. An example of such a notebook is the one from Stanford University. Please note that we only link this notebook for illustrative purposes and do not otherwise endorse its contents.
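As an indication of what a notebook entry can look like, here is a short sketch written for this page (it is not taken from any particular team's notebook): a disjoint-set union structure, one of the routines that teams commonly keep at hand.

    #include <utility>
    #include <vector>

    // Disjoint-set union (union-find) with path compression and union by size:
    // a typical short routine kept in a team reference document.
    struct DSU {
        std::vector<int> parent, size;
        explicit DSU(int n) : parent(n), size(n, 1) {
            for (int i = 0; i < n; ++i) parent[i] = i;
        }
        int find(int x) {                        // representative of x's set
            return parent[x] == x ? x : parent[x] = find(parent[x]);
        }
        bool unite(int a, int b) {               // merge the sets containing a and b
            a = find(a); b = find(b);
            if (a == b) return false;
            if (size[a] < size[b]) std::swap(a, b);
            parent[b] = a;
            size[a] += size[b];
            return true;
        }
    };

Entries like this one are usually kept short and tested in advance, so that they can be reused quickly and safely during the contest.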

Training camps

Several institutions organize training camps for ICPC and SWERC. If we are informed about them, we will list them here. If you would like your event to be listed here, please write to the webmaster.

Other programming contests

We list here some programming contests other than ICPC, with some brief information about their format (individual or team participation, participation over the Internet or on-site, age restrictions, etc.).

This list is not moderated and we do not especially endorse the events that it contains. If you would like your event to be listed here, please write to the webmaster.

Gold sponsor

Jane Street

Bronze sponsors

Jump Trading
Sopra Steria

Institutional sponsors

Région Île de France

ICPC Global sponsors

Huawei
JetBrains
IBM

Lisbon local sponsors

Critical TechWorks
Unbabel