Environment

Updates

This page was last updated on November 16, and the details posted here should now be final. Here are the main changes relative to the version that was originally announced:

  • Installation of some packages from Debian testing, in particular gcc version 7. You can refer to the new list of installed packages, the new list of manually installed packages, or the diff of the manually installed packages against the version that was originally announced.
  • The Atom text editor was installed.
  • The packages codeblocks-contrib, colorgcc, tmux, were installed.
  • The language list was clarified to point out that Python 2 will be implemented using PyPy. (The CPython 2 implementation is available on contestant machines, but it will not be used on judge machines.) Aliases mypython2 and mypython3 were added to point to the correct Python environments (a usage sketch is given after this list). However, note that the Spyder IDE with Python 2 is not configured to use mypython2, for technical reasons. All other IDEs are supposed to be configured to use the mygcc, myg++, and myjavac aliases, with the correct versions. The Python documentation was added.
  • The lists of installed Python 2 modules and installed Python 3 modules have been changed to match more closely the modules (Python 2, Python 3) offered at the finals. In particular, scipy and numpy were removed. Some differences in Python modules remain between our environment and the finals environment, in particular because the OS is different and because we provide IPython. The diff between our modules and the ones provided at the finals is as follows: Python 2, Python 3.
  • The IntelliJ IDEA Java IDE has been installed.
  • Some updates from Jessie and Jessie-backports were installed.
  • Many unnecessary packages were removed.
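
As a quick illustration of the Python aliases mentioned above, here is how a solution could be run locally with the judged interpreters (the file names are hypothetical, and the exact behaviour of the aliases should be checked on the machines):

    # Run a Python 3 solution with the interpreter the judges will use
    mypython3 solution.py < sample.in
    # Run a Python 2 solution (judged using PyPy)
    mypython2 solution.py < sample.in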

Team hardware

The team machines will have either 8 GB or 16 GB of RAM, of which at least 6 GB will be available in the contest environment. The machines will have one of the following CPUs:

  • Intel i3-4130 (2 cores, 4 threads, 3.4 GHz)
  • Intel i5-4590 (4 cores, 4 threads, 2.93 GHz)
  • Intel i7-2600 (4 cores, 8 threads, 3.4 GHz)
  • Intel i7-6700T (4 cores, 8 threads, 2.8 GHz)
  • Intel i7-7700T (4 cores, 8 threads, 2.9 GHz)

The allocation of machines to teams will be drawn at random.

The team machines will be equipped either with an AZERTY keyboard or with a QWERTY keyboard, depending on the information provided by the team at registration (see the registration page for details about how to specify this). Subject to availability, the AZERTY keyboards will be one of the following two models (one, two, allocated at random) and the QWERTY keyboards will be like this. Teams are not permitted to bring their own keyboards, but may put stickers on the keyboard if they wish; see regulations.

Of course, regardless of the physical layout of the keyboard, it is always possible to reconfigure it in software to use a different layout (one that does not match what is printed on the keys).
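
For example, under the GNOME desktop listed below, the standard X utility setxkbmap can switch layouts from a terminal; this is only a sketch, assuming the utility is present in the contest image:

    # Switch the current session to a US QWERTY layout
    setxkbmap us
    # Switch back to a French AZERTY layout
    setxkbmap fr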

Team software

The software configuration will be as follows:

  • OS
    • Debian GNU/Linux 8 (Jessie), 64-bit, with some packages from jessie-backports and Debian testing (see the updates above)
  • Desktop
    • GNOME 3
  • Editors
    • vi/vim
    • gvim
    • emacs
    • gedit
    • Kate
    • geany
    • atom
  • Languages
    • Java
      • openjdk version "1.8.0_144"
      • OpenJDK Runtime Environment (build 1.8.0_144-8u144-b01-1-b01)
      • OpenJDK 64-Bit Server VM (build 25.144-b01, mixed mode)
    • C
      • gcc (Debian 7.2.0-11) 7.2.0
    • C++
      • g++ (Debian 7.2.0-11) 7.2.0
    • Python
      • Python 2 (judged using the PyPy implementation; mypython2 alias)
      • Python 3 (mypython3 alias)
  • IDEs:
    • Eclipse 3.8.1-7
      • Eclipse CDT 8.5.0-1
      • PyDev 5.7.0
    • NetBeans 8.1+dfsg3-4
      • Python 0.151118
      • C/C++ 1.29.6.1
    • IntelliJ IDEA Community Edition 2017.2.5, Build #IC-172.4343.14
    • Code::Blocks 16.01+dfsg-2~bpo8+1
    • Spyder 2.3.1
  • Debuggers: gdb, valgrind
  • Browsers: Firefox, Chromium

The computers will not have any network access except to the judging system and to the reference documentation provided.

We used to provide an Open Virtualization Archive image of the contest environment, but the image is no longer available. Please refer to the corresponding page of the next SWERC edition for details.

Compilation flags

The judging system will compile submissions with the following commands, each of which is available as an alias on the team machines:

  • C (compiler: gcc, alias: mygcc)
    gcc -g -O2 -Wall -Wextra -std=gnu11 -static "$@" -lm
  • C++ (compiler: g++, alias: myg++)
    g++ -g -O2 -Wall -Wextra -std=gnu++14 -static "$@" -lm
  • Java (compiler: OpenJDK, alias: myjavac)
    javac "$@"
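
For example, to check that a solution compiles cleanly with the exact judge settings before submitting it (the file names below are only illustrative, and the aliases are assumed to substitute their arguments for "$@" in the commands above):

    # Compile a C++ solution with the same flags as the judging system
    myg++ solution.cpp -o solution
    # Compile a C solution
    mygcc solution.c -o solution
    # Compile a Java solution
    myjavac Solution.java
    # Try the compiled program on a sample input
    ./solution < sample.in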

Language features

The following language features are not permitted in any of the contest languages:

  • inline assembly code
  • threads
  • file I/O
  • file management
  • device management
  • interprocess communication
  • forking and execution of external commands

More generally, any system call other than memory management, reading from the standard input, writing to the standard output, and exception management is forbidden.

Submissions using any of these features will be rejected, either automatically by the judging system, or manually by the judges.

Judging hardware

Submissions will be judged on three machines, each with an Intel Xeon E5-2660 CPU (2.6 GHz, 20 cores) and between 256 GB and 512 GB of RAM.

Judging software

The judge machines run a Debian Linux Jessie 64-bit virtual machine with exactly the same software versions as the team machines described above.

The contest control system that will be used is DOMjudge. Submissions will be evaluated automatically unless something unexpected happens (system crash, error in a test case, etc.).

The possible verdicts are the following, in order of priority, i.e., DOMjudge will return the first matching verdict in this list:

  • Too-late: The submission was made after the end of the contest. This verdict does not lead to a time penalty.
  • Compiler-error: The contest control system failed to compile the submission. Warnings are not treated as errors. This verdict does not lead to a time penalty. Details of compilation errors will not be shown by the judging system.
  • Memory-limit: The submission exceeded the memory limit. (The precise memory limit is not specified.) This verdict leads to a 20-minute penalty.
  • Output-limit: The submission produced too much output. (The precise output limit is not specified.) This verdict leads to a 20-minute penalty.
  • Run-error: The submission failed to execute properly on a test case (segmentation fault, division by zero, etc.). Details of the error are not shown. This verdict leads to a 20-minute penalty.
  • Timelimit: The submission exceeded the time limit on at least one test case, which may indicate that the code went into an infinite loop or that the approach is not efficient enough. (The precise time limit is not specified, but the order of magnitude is a few seconds.) This verdict leads to a 20-minute penalty.
  • Wrong-answer: The submission executed properly on a test case but did not produce the correct answer. Details are not specified. This verdict leads to a 20-minute penalty.
  • No-output: On at least one test case, the submission executed properly but did not produce any output, while on the other test cases it executed properly and produced the correct output. This verdict leads to a 20-minute penalty.
  • Correct: The submission was accepted. Note that this verdict may still be overridden manually by the judges.

Of course, as specified in the regional rules, the 20-minute penalty is only applied for submissions to problems that are eventually solved by the team.

Note that there is no "presentation-error" verdict: errors in output format are treated as wrong answers.

If a submission is rejected, no information will be provided about the number of the test case(s) where the submission failed.

Problem set

The problem set will be provided on paper (one copy per contestant), and also in PDF files on the judge system (allowing you to copy and paste the sample inputs and outputs).

Making a submission

Once you have written code to solve a problem, you can submit it to the contest control system for evaluation. Authentication to the contest control system is done by IP under normal circumstances, so you do not need to log in. You can submit using the web interface, by opening a web browser and using the provided links/bookmarks, or you can submit from the command line using the submit program. (Make sure that the file that you wish to submit has the correct name, because submit uses this to determine automatically for which problem you are submitting.)
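
For example, assuming a C++ solution to problem A is saved in a file named after the problem letter (the exact naming convention and invocation will be confirmed at the contest), a command-line submission could look like this:

    # Submit a.cpp; the submit program infers the problem and language from the file name
    submit a.cpp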

Asking questions

If a contestant has an issue with the problem set (e.g., it is ambiguous or incorrect), they can ask the judges a question using the clarification request mechanism of DOMjudge. Usually, the judges will either decline to answer or issue a general clarification to all teams to clarify the meaning of the problem set or fix the error.

If a contestant has a technical issue with the team workstation (hardware malfunction, computer crash, etc.), they should ask the volunteer in their room for help.

Of course, neither the judges nor the volunteers will answer requests for technical support, such as debugging your code or understanding a compiler error.

Location and rooms

The contest will be held at Télécom ParisTech, 46 rue Barrault, 75013 Paris, in rooms located approximately here. Between six and eight different rooms will be used. The rooms are all located on the second floor of the building and have windows (the height above ground level is roughly 5 to 10 meters).

Printing

During the contest, teams will be able to request printouts, e.g., of their code. These printouts will be delivered by volunteers. Printouts can be requested within reason, i.e., as long as the requested quantities do not negatively impact contest operations.

Requests for additional software

We will not be able to honor requests to install additional software during SWERC, and it is no longer possible to make such requests.

Gold sponsors

Palantir, Société Générale, Criteo Labs

Silver sponsors

Almerys, Télécom ParisTech

Bronze sponsors

Google, Inria