When preparing your artifact for submission to the RTSS 2025 Artifact Evaluation process, please adhere to the following guidelines and recommendations.
Recommendation: Use Virtual Machines
Based on previous experience, the biggest hurdle to successful reproducibility is the setup and installation of the necessary libraries and dependencies. Authors are therefore encouraged to prepare a virtual machine (VM) image including their artifact (if possible) and to make it available via HTTP throughout the evaluation process (and, ideally, afterwards).
As the basis of the VM image, please choose a commonly used OS version that has been tested with the virtualization software and that evaluators are likely to be familiar with. We encourage authors to use VirtualBox and to save the VM image as an Open Virtual Appliance (OVA) file.
To facilitate the preparation of the VM, we suggest using the VM images available at https://www.osboxes.org/.
Alternative: Use Docker
Depending on the kind of artifact, a potentially more lightweight alternative to a full VM is to package the artifact (or a part of it) as a Docker image (or, more generally, as an OCI-compliant container image).
However, authors must not assume that all evaluators have a working Docker setup, nor that all evaluators are familiar with launching containerized applications. Any artifact relying on containers must therefore include detailed instructions for setting up a suitable environment (for instance, by linking to the official Docker installation instructions) and for running the provided containerized software.
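Independently of the exact container setup, evaluators benefit from a single, documented entry point that launches the containerized artifact. The following Python sketch is purely illustrative of such an entry point; the image name rtss25-artifact:latest and the in-container command ./run_experiments.sh are hypothetical placeholders, not part of any required convention.

    #!/usr/bin/env python3
    """Minimal launcher for a containerized artifact (illustrative sketch only)."""
    import shutil
    import subprocess
    import sys

    # Hypothetical placeholders; replace with the artifact's actual image and command.
    IMAGE = "rtss25-artifact:latest"
    COMMAND = ["./run_experiments.sh"]

    def main() -> None:
        # Fail early with a helpful message if Docker is not installed or not on PATH.
        if shutil.which("docker") is None:
            sys.exit("Docker not found; see https://docs.docker.com/engine/install/ "
                     "for installation instructions.")
        # Run the container interactively and remove it once the command finishes.
        completed = subprocess.run(["docker", "run", "--rm", "-it", IMAGE] + COMMAND)
        sys.exit(completed.returncode)

    if __name__ == "__main__":
        main()

A short section in the instructions explaining how to invoke such a script (or the equivalent docker commands directly) is usually enough for evaluators who have never worked with containers before.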
Artifact Preparation Instructions
When submitting an artifact for evaluation, please provide a document (e.g., a PDF, HTML, or text file) with instructions on how to use the artifact to reproduce the results in the paper. The document should include a link to the virtual machine image and a description of key configuration parameters for the virtual machine (RAM, number of cores, etc.), as well as the host platform on which you prepared and tested your VM image.
Please provide precise instructions on how to proceed after booting the image, including instructions for running the artifact.
Authors are strongly encouraged to prepare readable scripts that automatically launch the experiments. If the experiments take a long time to complete, authors may additionally prepare simplified experiments (e.g., by reducing the number of samples over which the results are averaged) with shorter running times that demonstrate the same trends as observed in the complete experiments (and as reported in the paper).
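One common way to offer both a complete and a simplified run is a single launch script with a command-line switch. The sketch below is illustrative only and assumes hypothetical names (run_benchmark, FULL_SAMPLES, QUICK_SAMPLES); adapt it to whatever experiments your artifact actually runs.

    #!/usr/bin/env python3
    """Illustrative experiment launcher with a reduced "quick" mode (sketch only)."""
    import argparse

    # Hypothetical sample counts: the full count matches the paper's setup, the quick
    # count is small enough to finish quickly while showing the same trends.
    FULL_SAMPLES = 1000
    QUICK_SAMPLES = 50

    def run_benchmark(name: str, samples: int) -> float:
        """Placeholder for the artifact's real measurement code; returns an average."""
        return sum(i % 7 for i in range(samples)) / samples

    def main() -> None:
        parser = argparse.ArgumentParser(description="Reproduce the paper's experiments.")
        parser.add_argument("--quick", action="store_true",
                            help="run a shortened version of the experiments")
        args = parser.parse_args()
        samples = QUICK_SAMPLES if args.quick else FULL_SAMPLES
        for name in ("baseline", "proposed"):
            print(f"{name}: {run_benchmark(name, samples):.3f} "
                  f"(averaged over {samples} samples)")

    if __name__ == "__main__":
        main()

Documenting the expected running time of both modes in the instructions helps evaluators decide which one to run.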
If no VM image is provided, the document must include fully reproducible installation and execution instructions for common OS platforms (e.g., popular Linux distributions such as Debian or Ubuntu).
Finally, be sure to include a version of the accepted paper related to the artifact that is as close as possible to the final camera-ready version, reflecting as many of the reviewer comments as possible, especially in the evaluation section, so that evaluators can fully understand the context and evaluation criteria.
A good "how-to" guide for preparing an effective artifact evaluation package is available online at http://bit.ly/HOWTO-AEC.
The AE process is single-anonymous: evaluators remain anonymous, but authors' names and affiliations may be revealed to evaluators. The rationale is that a requirement to anonymize artifacts (e.g., source code comments, dependencies, data sets, etc.) would result in a significant burden for little gain.
Atypical Artifacts
If you are not in a position to prepare the artifact as above, or if your artifact requires special libraries, commercial tools (e.g., MATLAB or specific toolboxes), or particular hardware (e.g., specific GPUs), please contact the AE chair as soon as possible for case-by-case arrangements.
Further Questions
In case of any questions or concerns, or for advice on how to best package and submit complex artifacts, please contact the Artifact Evaluation Chair.