Environment

Team hardware

The team machines will have either 8 GB or 16 GB of RAM, of which at least 6 GB will be available for the virtual machine that provides the contest environment. The machines will have one of the following CPUs:

  • Intel i3-4130 (2 cores, 4 threads, 3.4 GHz)
  • Intel i5-4590 (4 cores, 4 threads, 2.93 GHz)
  • Intel i7-2600 (4 cores, 8 threads, 3.4 GHz)
  • Intel i7-6700T (4 cores, 8 threads, 2.8 GHz)
  • Intel i7-7700T (4 cores, 8 threads, 2.9 GHz)

The allocation of machines to teams will be drawn at random.

The team machines will be equipped with either an AZERTY keyboard or a QWERTY keyboard, depending on the information provided by the team at registration (see the registration page for details on how to specify this). Subject to availability, the AZERTY keyboards will be one of two models, allocated at random, and the QWERTY keyboards will be a single standard model. Teams are not permitted to bring their own keyboards, but may put stickers on the keyboard if they wish; see the regulations.

Of course, no matter the physical layout of the keyboards, it is always possible to reconfigure them in software to a different layout (that does not match what is printed on the keyboard).

Team software

The software configuration will be as follows:

  • OS
    • Debian Linux Jessie 64-bit
  • Desktop
    • GNOME 3
  • Editors
    • vi/vim
    • gvim
    • emacs
    • gedit
    • Kate
    • geany
    • atom
  • Languages
    • Java
      • openjdk version "10.0.2" 2018-07-17
      • OpenJDK Runtime Environment (build 10.0.2+13-Debian-2)
      • OpenJDK 64-Bit Server VM (build 10.0.2+13-Debian-2, mixed mode)
    • C
      • gcc (Debian 7.2.0-11) 7.2.0
    • C++
      • g++ (Debian 7.2.0-11) 7.2.0
    • Python
      • Python 2.7.8 (2.4.0+dfsg-3, Dec 20 2014, 13:30:46) [PyPy 2.4.0 with GCC 4.9.2]; a list of installed modules is provided
      • Python 3.4.2; a list of installed modules is provided
      • Note that some additional Python modules (e.g., those related to graphical user interfaces) will be available in the contest environment but not in the judging environment; the lists above indicate what the judging environment provides.
  • IDEs:
    • IntelliJ IDEA Community Edition 2018.2.5, Build #IC-182.4892.20
    • PyCharm Community 2018.2.4 (Community Edition), Build #PC-182.4505.26
    • Eclipse IDE for Java Developers (Oxygen.3a Release 4.7.3a, build id 20180405-1200)
    • Eclipse IDE for C/C++ Developers (Oxygen.3a Release 4.7.3a, build id 20180405-1200)
    • Apache NetBeans IDE 9.0 (Build incubator-netbeans-release-334-on-20180708)
    • Code::Blocks 16.01+dfsg-2~bpo8+1
    • Spyder 2.3.1
  • Debuggers: gdb, valgrind, ddd
  • Browsers: Firefox, Chromium

The computers will not have any network access except to the judging system and to the reference documentation provided.

We provide an Open Virtualization Archive image of the contest environment, to be used with virtualization software such as VirtualBox. The contestant account has login "swerc" and password "swerc", and is logged in automatically on startup. The administrative account has login "swercadmin" and password "swercadmin" and has sudoer rights. Unlike the actual contest environment, no network restrictions are enforced in this image.

Compilation flags

The judging system will compile submissions with the following options. Each command is available as an alias on the team machines:

Language   Implementation   Command                                                  Alias
C          gcc              gcc -g -O2 -Wall -Wextra -std=gnu11 -static "$@" -lm     mygcc
C++        g++              g++ -g -O2 -Wall -Wextra -std=gnu++14 -static "$@" -lm   myg++
Java       OpenJDK          javac "$@"                                               myjavac
Python 2   PyPy             pypy "$@"                                                mypython2
Python 3   CPython          python3 "$@"                                             mypython3

Java programs will be executed with a stack size of 8 MB (i.e., -Xss8m) and a memory limit equal to the memory limit for the problem (set via -Xmx).

Language features

The following language features are not permitted in any of the contest languages:

  • inline assembly code
  • threads
  • file I/O
  • file management
  • device management
  • interprocess communication
  • forking and execution of external commands

More generally, any system call other than memory management, reading from standard input, writing to standard output, and exception management is forbidden.
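To illustrate the restriction, a compliant solution interacts with the outside world only through standard input and standard output. The sketch below assumes a hypothetical problem (read n, then n integers, print their sum); the structure, not the problem, is the point:

```python
import sys

def solve(data):
    # Parse the whole input as whitespace-separated tokens:
    # first the count n, then n integers.
    tokens = data.split()
    n = int(tokens[0])
    return sum(int(t) for t in tokens[1:1 + n])

def main():
    # The only I/O: read standard input, write standard output.
    sys.stdout.write(str(solve(sys.stdin.read())) + "\n")
```

An actual submission would end with a call to main(); no files are opened, no processes are spawned, and no other system calls are made.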

Submissions using any of these features will be rejected, either automatically by the judging system, or manually by the judges.

Judging hardware

Submissions will be judged on three machines, each having an Intel Xeon E5-2660 CPU (2.6 GHz, 20 cores) and having between 256 GB and 512 GB of RAM.

Judging software

The software configuration of the judge machines is a Debian Linux Jessie 64-bit virtual machine with exactly the same software versions as the team software above.

The contest control system that will be used is DOMjudge. We have a public DOMjudge instance with the SWERC'17 problems that contestants can use to prepare; it should be very similar to the judging system that we will use at SWERC.

Submissions will be evaluated automatically unless something unexpected happens (system crash, error in a test case, etc.).

Verdicts are given in the following order:

  • Too-late: This verdict is given if the submission was made after the end of the contest. This verdict does not lead to a time penalty.
  • Compiler-error: This verdict is given if the contest control system failed to compile the submission. Warnings are not treated as errors. This verdict does not lead to a time penalty. Details of compilation errors will not be shown by the judging system. If code that compiles correctly in the contest environment leads to a Compiler-error verdict on the judge, contestants should submit a clarification request to the judges.
  • The submission is then evaluated on several secret test cases in a fixed order. Each test case is independent, i.e., the time limits, memory limits, etc., apply to each individual test case. If the submission fails to correctly process a test case, evaluation stops, an error verdict is returned (see the next list), and a penalty of 20 minutes is added for the problem (counted against the team only if the problem is eventually solved). If a submission is rejected, no information will be provided about which test case(s) the submission failed.
  • Correct: If the evaluation process completes and the submission has returned the correct answer on each secret test case following all requirements, then the submission is accepted. Note that this verdict may still be overridden manually by judges.

The following errors can be raised on a submission. The verdict returned is the one for the first test case where the submission has failed. The verdicts are as follows, in order of priority:

  • Error verdicts where execution did not complete; the verdict returned will be that of the first error amongst:
    • Output-limit: The submission produced too much output. (The precise output limit is not specified.)
    • Run-error: The submission failed to execute properly on a test case (segmentation fault, division by zero, exceeding the memory limit, etc.). Details of the error are not shown.
    • Timelimit: The submission exceeded the time limit on one test case, which may indicate that the code went into an infinite loop or that the approach is not efficient enough. (The precise time limit is not specified, but the order of magnitude is a few seconds.)
  • Error verdicts where execution completed but did not produce the correct answer; the verdict returned will be the first matching verdict amongst:
    • No-output: There is at least one test case where the submission executed correctly but did not produce any output, and on the other test cases it executed properly and produced the correct output.
    • Wrong-answer: The submission executed properly on a test case but did not produce the correct answer. Details are not specified.

Note that there is no "presentation-error" verdict: errors in output format are treated as wrong answers.
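Since format errors count as wrong answers, it pays to emit exactly the requested format: one value per line, no trailing spaces, and the exact number of decimals. A small sketch (the two-decimal requirement is a hypothetical example, not a SWERC rule):

```python
def format_answer(value):
    # Exactly two decimals, no trailing whitespace.
    return "{:.2f}".format(value)

print(format_answer(3.14159))  # -> 3.14
print(format_answer(2.0))      # -> 2.00
```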

Problem set

The problem set will be provided on paper (one copy per contestant), and also in PDF files on the judge system (allowing you to copy and paste the sample inputs and outputs). We may also provide an archive of the sample inputs and outputs to be used directly.

Making a submission

Once you have written code to solve a problem, you can submit it to the contest control system for evaluation. Authentication to the contest control system is done by IP under normal circumstances, so you do not need to log in. You can submit using the web interface, by opening a web browser and using the provided links/bookmarks, or you can submit by command line using the submit program. (Make sure that the file that you wish to submit has the correct name, because submit uses this to determine automatically for which problem you are submitting.)

Asking questions

If a contestant has an issue with the problem set (e.g., it is ambiguous or incorrect), they can ask a question to the judges using the clarification request mechanism of DOMjudge. Usually, the judges will either decline to answer or issue a general clarification to all teams, clarifying the meaning of the problem set or fixing the error.

If a contestant has a technical issue with the team workstation (hardware malfunction, computer crash, etc.), they should ask the volunteer in their room for help.

Of course, neither the judges nor the volunteers will answer requests for help with solving the problems, e.g., debugging your code, understanding a compiler error, etc.

Location and rooms

The contest will be held at Télécom ParisTech, 46 rue Barrault, 75013 Paris. Between six and eight different rooms will be used, all located on the second floor of the building; the rooms have windows (roughly 5 to 10 meters above ground level).

Printing

During the contest, teams may request printouts, e.g., of their code; these will be delivered by volunteers. Printouts can be requested within reason, i.e., as long as the requested quantities do not negatively impact contest operations.

You can print using the web interface or from the command line using the printout program. (Make sure that the file you wish to print has the correct extension, because printout uses this to determine automatically which syntax highlighting to apply.)

Requests for additional software

It will not be possible to request installation of additional software during SWERC, except in case of unforeseen problems or when the software is very simple to install (e.g., a simple apt-get install command). As of November 1st, the deadline for additional software requests has passed.

Europe sponsor

Huawei

Gold sponsors

Lokad, SYSTRAN

Silver sponsors

Almerys, Facebook, Région Île-de-France, Télécom ParisTech

Bronze sponsors

Criteo Labs, Inria

World Finals sponsor

JetBrains