Results of the 8th Intl. Competition on Software Testing (Test-Comp 2026)
This file describes the contents of an archive of the 8th Competition on Software Testing (Test-Comp 2026). https://test-comp.sosy-lab.org/2026/
The competition was organized by Dirk Beyer, LMU Munich, Germany. More information is available in the following article: Dirk Beyer. Evaluating Tools for Automatic Software Testing (Report on Test-Comp 2026). In Proceedings of the 29th International Conference on Fundamental Approaches to Software Engineering (FASE 2026, Turin, Italy, April 11–16), 2026. Springer.
Copyright (C) 2026 Dirk Beyer https://www.sosy-lab.org/people/beyer/
SPDX-License-Identifier: CC-BY-4.0 https://spdx.org/licenses/CC-BY-4.0.html
To browse the competition results with a web browser, there are two options:
- start a local web server using php -S localhost:8000 in order to view the data in this archive, or
- browse https://test-comp.sosy-lab.org/2026/results/ in order to view the data on the Test-Comp web page.
Contents
- index.html: directs to the overview web page
- LICENSE-results.txt: specifies the license
- README-results.txt: this file
- results-validated/: results of validation runs
- results-verified/: results of test-generation runs and aggregated results
The folder results-validated/ contains the results from validation runs:
- *.results.txt: TXT results from BenchExec
- *.xml.bz2: XML results from BenchExec
- *.logfiles.zip: output from tools
- *.json.gz: mapping from file names to SHA-256 hashes of the file contents
The folder results-verified/ contains the results from test-generation runs and aggregated results:
- index.html: overview web page with rankings and score table
- design.css: HTML style definitions
- *.results.txt: TXT results from BenchExec
- *.xml.bz2: XML results from BenchExec
- *.fixed.xml.bz2: XML results from BenchExec, status adjusted according to the validation results
- *.csv: CSV results
- *.fixed.csv: CSV results, status adjusted according to the validation results
- *.logfiles.zip: output from tools
- *.json.gz: mapping from file names to SHA-256 hashes of the file contents
- *.xml.bz2.table.html: HTML views of the detailed results data, as generated by BenchExec's table generator
- <tester>*.table.html: HTML views of the full benchmark set (all categories) for each tester
- META_*.table.html: HTML views of the benchmark set for each meta category, per tester and over all testers
- <category>*.table.html: HTML views of the benchmark set for each category over all testers
- *.xml: XML table definitions for the above tables
- results-per-tool.php: list of results for each tool, for the review process in the pre-run phase
- <tester>.list.html: list of results for a tool in HTML format, with links
- quantilePlot-*: score-based quantile plots as visualization of the results
- quantilePlotShow.gp: example Gnuplot script to generate a plot
- score*: accumulated score results in various formats
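For programmatic inspection, the *.xml.bz2 result files can be read with Python's standard library alone. The sketch below extracts the status of each run; the element and attribute names (`<run name=...>` containing `<column title="status" value=...>`) are assumptions based on typical BenchExec output, so adjust them if your files differ:

```python
import bz2
import xml.etree.ElementTree as ET


def read_statuses(xml_bz2_path):
    """Map each run's task name to its reported status.

    The element/attribute layout (<run name=...> with
    <column title="status" value=...>) is an assumption based on
    typical BenchExec result files.
    """
    with bz2.open(xml_bz2_path, "rb") as f:
        root = ET.parse(f).getroot()
    statuses = {}
    for run in root.iter("run"):
        for col in run.iter("column"):
            if col.get("title") == "status":
                statuses[run.get("name")] = col.get("value")
    return statuses
```

The same approach extends to other columns (cputime, memory) by matching on their `title` attributes.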
The SHA-256 hashes of the file contents (in the *.json.gz files) are useful for
- validating the exact contents of a file and
- accessing the files from the witness store.
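The validation step above can be sketched in Python. This is a minimal sketch that assumes each *.json.gz file contains a flat JSON object mapping file names to hex SHA-256 digests:

```python
import gzip
import hashlib
import json
import os


def load_mapping(json_gz_path):
    """Load a *.json.gz mapping (assumed: flat JSON object,
    file name -> hex SHA-256 digest)."""
    with gzip.open(json_gz_path, "rt", encoding="utf-8") as f:
        return json.load(f)


def sha256_of(path):
    """Compute the hex SHA-256 digest of a file, streamed in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()


def verify(json_gz_path, root="."):
    """Return the names of files whose content does not match the recorded hash."""
    mapping = load_mapping(json_gz_path)
    return [name for name, digest in mapping.items()
            if sha256_of(os.path.join(root, name)) != digest]
```

An empty list from `verify` means every listed file matches its recorded digest.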
Related Archives
Overview of archives from Test-Comp 2026 that are available at Zenodo:
- https://doi.org/10.5281/zenodo.18650733 Test Suites from Test-Generation Tools (Test-Comp 2026). Store of coverage witnesses (containing the generated test suites)
- https://doi.org/10.5281/zenodo.18650756 FM-Tools Release 2.3: Data Set of Metadata about Tools for Formal Methods (SV-COMP 2026, Test-Comp 2026). Metadata snapshot of the evaluated tools (DOIs, options, etc.)
- https://doi.org/10.5281/zenodo.18650772 Results of the 8th Intl. Competition on Software Testing (Test-Comp 2026). Results (XML result files, log files, file mappings, HTML tables)
- https://doi.org/10.5281/zenodo.18650775 SV-Benchmarks: Benchmark Set for Software Verification and Testing (SV-COMP 2026, Test-Comp 2026). Test-generation tasks, version testcomp26
- https://doi.org/10.5281/zenodo.18455156 sosy-lab/benchexec: Release 3.34 Benchmarking framework
- https://doi.org/10.5281/zenodo.18650812 FM-Weck: Release 1.6.0 Containerized execution and continuous testing of formal-methods tools
All benchmarks were executed for Test-Comp 2026 https://test-comp.sosy-lab.org/2026/ by Dirk Beyer, LMU Munich, based on the following components:
- https://gitlab.com/sosy-lab/benchmarking/fm-tools 2.3
- https://gitlab.com/sosy-lab/benchmarking/sv-benchmarks testcomp26
- https://gitlab.com/sosy-lab/test-comp/bench-defs testcomp26
- https://gitlab.com/sosy-lab/software/benchexec 3.34
- https://gitlab.com/sosy-lab/software/benchcloud 1.5.0
- https://gitlab.com/sosy-lab/software/fm-weck 1.6.0
- https://gitlab.com/sosy-lab/benchmarking/competition-scripts testcomp26
- https://gitlab.com/sosy-lab/test-comp/test-format testcomp26
Contact
Feel free to contact me in case of questions: https://www.sosy-lab.org/people/beyer/
Files
- testcomp26-results.zip (16.5 GB, md5: 3438cb5d3d6c05b3cbb0cf4eadbdcc59)