Published August 4, 2025 | Version v1
Conference paper · Open Access

The NFDIxCS Artifact Evaluation Platform

  • Karlsruhe Institute of Technology
  • University of Potsdam
  • University of Duisburg-Essen
  • Nationale Forschungsdateninfrastruktur (NFDI) e.V.
  • University of Amsterdam

Description

To achieve good and FAIR research data or software (artifacts), these artifacts should meet certain quality standards. Artifact evaluation tracks have therefore been established at conferences as a means of quality assurance. In these tracks, researchers (authors) submit their artifacts, which are reviewed by other researchers (reviewers). With successful reviews, authors earn badges that attest to the quality criteria they have met.

To support artifact evaluation tracks, the NFDIxCS consortium provides a self-hosted instance of HotCRP, a web-based platform for managing conference review processes, as the NFDIxCS Artifact Evaluation Platform (AEP). All components and extensions of the AEP are open source. In the context of the AEP, we apply HotCRP to artifact evaluations to provide an infrastructure for these processes. Concretely, the chairs of an artifact evaluation track can set up and configure a dedicated page for their track. In addition, they can customize the forms for submitting artifacts and reviews to match a conference's needs and criteria. This enables chairs, for example, to include the selection of badges in both forms: authors can select the badges for which they apply, and reviewers can select the badges with which the authors should be awarded. The artifacts themselves are usually uploaded to a different platform intended, for instance, for long-term archival, and authors only put a link to the artifact in the AEP.

After all artifacts have been submitted, reviewers can bid for the artifacts that they want to review. Based on the bids, the chairs can assign reviewers to artifacts so that reviewing can start. During the review, reviewers can download the artifacts via the links from the submission forms. Furthermore, authors and reviewers can exchange pseudonymized comments to clarify issues or questions. Authors are also usually allowed to update their artifacts if necessary (e.g., to fix a bug).
Reviewers complete their review by filling out the review form. The AEP supports both single- and double-blind reviews. Based on the reviews, the chairs can decide on the awarded badges and notify the authors.

With the AEP, we have already hosted the artifact evaluation tracks of the International Conference on Software Architecture 2024 and 2025. Currently, we are conducting a survey within the most recently supported track to gather feedback and identify aspects to improve. Additionally, we are exploring Cloud Development Environments for reviewing artifacts in a cloud-based environment to ease the review process. Moreover, we are working on improving the technical setup of the AEP to enable other institutions to host their own instances. Lastly, we are integrating the Research Data Management Container from NFDIxCS into artifact evaluations to facilitate reviews for such containers.
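The bid-based assignment step described above can be illustrated with a small sketch. HotCRP ships its own assignment tooling, so the following greedy heuristic is purely hypothetical: it prefers reviewers who bid on an artifact and breaks ties by current workload, which is one simple way chairs could turn bids into assignments.

```python
from collections import defaultdict

def assign_reviewers(bids, artifacts, per_artifact=2):
    """Greedy, illustrative assignment of reviewers to artifacts.

    bids: dict mapping reviewer name -> set of artifact ids they bid on.
    artifacts: iterable of artifact ids to be reviewed.
    Prefers reviewers who bid on an artifact; ties are broken by the
    reviewer's current load so work is spread evenly. Not HotCRP's
    actual algorithm, just a sketch of the idea.
    """
    load = {reviewer: 0 for reviewer in bids}
    assignment = defaultdict(list)
    for artifact in artifacts:
        # Reviewers who bid on this artifact sort first (False < True);
        # within each group, the least-loaded reviewer wins.
        ranked = sorted(bids, key=lambda r: (artifact not in bids[r], load[r]))
        for reviewer in ranked[:per_artifact]:
            assignment[artifact].append(reviewer)
            load[reviewer] += 1
    return dict(assignment)
```

For example, with `bids = {"alice": {"A1"}, "bob": {"A1", "A2"}, "carol": {"A2"}}` and artifacts `["A1", "A2"]`, each artifact receives two reviewers, and every reviewer is matched to at least one artifact they bid on.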

Files

CoRDI_2025_paper_278.pdf

Files (85.3 kB)

md5:de9215416b7beba32454a375cee08082 (85.3 kB)
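The listed md5 checksum can be used to verify the integrity of a downloaded copy of the paper. A minimal sketch using Python's standard `hashlib`; the local file path is an assumption:

```python
import hashlib

def md5_of(path, chunk_size=8192):
    """Compute the md5 hex digest of a file, reading it in chunks
    so large files do not need to fit in memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical local download; compare against the checksum listed above.
expected = "de9215416b7beba32454a375cee08082"
# md5_of("CoRDI_2025_paper_278.pdf") == expected  -> file is intact
```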