Published January 8, 2022 | Version v1

Results of the 4th Intl. Competition on Software Testing (Test-Comp 2022)

  • Dirk Beyer (LMU Munich, Germany)

Description

Test-Comp 2022

Competition Results

This file describes the contents of an archive of the 4th Competition on Software Testing (Test-Comp 2022).
https://test-comp.sosy-lab.org/2022/

The competition was run by Dirk Beyer, LMU Munich, Germany.
More information is available in the following article:
Dirk Beyer. Advances in Automatic Software Testing: Test-Comp 2022. In Proceedings of the 25th International Conference on Fundamental Approaches to Software Engineering (FASE 2022, Munich, April 2-7), 2022. Springer.

Copyright (C) Dirk Beyer
https://www.sosy-lab.org/people/beyer/

SPDX-License-Identifier: CC-BY-4.0
https://spdx.org/licenses/CC-BY-4.0.html

To browse the competition results with a web browser, there are two options: open the overview web page index.html from this archive, or visit the competition web site at https://test-comp.sosy-lab.org/2022/.

Contents

  • index.html: directs to the overview web page
  • LICENSE.txt: specifies the license
  • README.txt: this file
  • results-validated/: results of validation runs
  • results-verified/: results of test-generation runs and aggregated results

The folder results-validated/ contains the results from validation runs:

  • *.xml.bz2: XML results from BenchExec (a reading sketch follows this list)
  • *.logfiles.zip: output from tools
  • *.json.gz: mapping from file names to SHA-256 hashes of the file contents
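
The *.xml.bz2 files in both results folders can be inspected directly from Python. Below is a minimal sketch (Python 3, standard library only), not part of the archive: the file name is a placeholder, and the element layout (<run> elements carrying a <column title="status" .../> child) reflects BenchExec's XML result format; adjust the names if your files differ.

    import bz2
    import xml.etree.ElementTree as ET

    # Placeholder name; pick any *.xml.bz2 file from results-validated/ or results-verified/.
    path = "results-validated/example.results.xml.bz2"

    with bz2.open(path, "rb") as f:
        root = ET.parse(f).getroot()

    # Each <run> element describes one task; its "status" column holds the outcome.
    for run in root.iter("run"):
        status = next(
            (c.get("value") for c in run.iter("column") if c.get("title") == "status"),
            "unknown",
        )
        print(run.get("name"), status)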

The folder results-verified/ contains the results from test-generation runs and aggregated results:

  • index.html: overview web page with rankings and score table

  • design.css: HTML style definitions

  • *.xml.bz2: XML results from BenchExec

  • *.merged.xml.bz2: XML results from BenchExec, status adjusted according to the validation results

  • *.logfiles.zip: output from tools (a reading sketch follows this list)

  • *.json.gz: mapping from file names to SHA-256 hashes of the file contents

  • *.xml.bz2.table.html: HTML views of the detailed results data, as generated by BenchExec’s table generator

  • *.All.table.html: HTML views of the full benchmark set (all categories) for each tool

  • META_*.table.html: HTML views of the benchmark set for each meta category for each tool, and over all tools

  • <category>*.table.html: HTML views of the benchmark set for each category over all tools

  • iZeCa0gaey.html: HTML views per tool

  • quantilePlot-*: score-based quantile plots as visualization of the results

  • quantilePlotShow.gp: example Gnuplot script to generate a plot

  • score*: accumulated score results in various formats
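
To read the raw tool output referenced in the lists above, the *.logfiles.zip archives can be opened with Python's standard zipfile module. A minimal sketch, with a placeholder archive name:

    import zipfile

    # Placeholder name; pick any *.logfiles.zip file from results-verified/.
    path = "results-verified/example.logfiles.zip"

    with zipfile.ZipFile(path) as zf:
        names = zf.namelist()
        print(f"{len(names)} log files, for example {names[0]}")
        # Print the beginning of the first log file.
        with zf.open(names[0]) as log:
            print(log.read().decode("utf-8", errors="replace")[:500])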

The SHA-256 hashes of the file contents (in the *.json.gz files) are useful for

  • validating the exact contents of a file (a verification sketch follows this list) and
  • accessing the files from the witness store.
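
A minimal verification sketch (Python 3, standard library only): the paths and the look-up key are placeholders, and the assumption that each *.json.gz file holds a flat dictionary from file names to SHA-256 hashes follows from the descriptions above.

    import gzip
    import hashlib
    import json

    mapping_path = "results-verified/example.json.gz"   # placeholder name
    file_name = "name-of-a-file-listed-in-the-mapping"   # placeholder key
    local_copy = "path/to/a/local/copy"                  # placeholder path

    # Assumption: the mapping is a flat dictionary from file names to SHA-256 hashes.
    with gzip.open(mapping_path, "rt", encoding="utf-8") as f:
        name_to_hash = json.load(f)

    expected = name_to_hash[file_name]
    with open(local_copy, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()

    print("OK" if actual == expected else f"mismatch: {actual} != {expected}")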

Other Archives

Overview of the archives from Test-Comp 2022 that are available at Zenodo:

All benchmarks were executed for Test-Comp 2022 (https://test-comp.sosy-lab.org/2022/) by Dirk Beyer, LMU Munich, based on the following components:

Contact

Feel free to contact me in case of questions: https://www.sosy-lab.org/people/beyer/

Files

  • testcomp22-results.zip: 1.6 GB, md5:acd8773dcf57a8f04f5400a869787ca2