Results of the 15th Intl. Competition on Software Verification (SV-COMP 2026)
SV-COMP 2026
Changelog
Version 2 of this archive (from 2026-04-25)
- contains the CSV files, which were missing in version 1,
- suppresses the counts of correct_true and correct_false for the categories C.FalseOverall and C.TrueOverall, respectively, in the RSF file and CSV files,
- uses two columns in the legends for quantile plots if there are many tools, and
- other minor fixes.
Competition Results
This file describes the contents of an archive of the 15th Competition on Software Verification (SV-COMP 2026). https://sv-comp.sosy-lab.org/2026/
The log files of the validation track are available in a separate archive: https://doi.org/10.5281/zenodo.18653053.
The competition was organized by Dirk Beyer, LMU Munich, Germany, and Jan Strejček, Masaryk University, Czechia. More information is available in the following article: Dirk Beyer and Jan Strejček. Evaluating Software Verifiers for C, Java, and SV-LIB (Report on SV-COMP 2026). In Proceedings of the 32nd International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS 2026, Turin, Italy, April 11–16), 2026. Springer.
Copyright (C) 2026 Dirk Beyer and Jan Strejček https://www.sosy-lab.org/people/beyer/ https://www.fi.muni.cz/~xstrejc/
SPDX-License-Identifier: CC-BY-4.0 https://spdx.org/licenses/CC-BY-4.0
To browse the competition results with a web browser, there are two options:
- start a local web server using php -S localhost:8000 in order to view the data in this archive, or
- browse https://sv-comp.sosy-lab.org/2026/results/ in order to view the data on the SV-COMP web page.
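For the first option, the steps could look like this. This is a sketch, assuming PHP is installed and that the overview page index.html sits at the top level of the unpacked archive; the directory name is illustrative:

```shell
# Unpack the results archive and serve its contents with PHP's built-in web server.
unzip svcomp26-results.zip -d svcomp26-results
cd svcomp26-results
php -S localhost:8000
# Then open http://localhost:8000/ in a browser to view index.html.
```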
Contents
- index.html: directs to the overview web pages of the verification and validation track
- LICENSE-results.txt: specifies the license
- README-results.txt: this file
- results-validated/: results of validation runs
- results-verified/: results of verification runs
The folder results-validated/ contains the results from validation runs:
- index.html: overview web page with rankings and score table
- design.css: HTML style definitions
- *.results.txt: TXT results from BenchExec
- *.xml.bz2: XML results from BenchExec
- *.fixed.xml.bz2: XML results from BenchExec, status adjusted according to the validation results
- *.csv: CSV results
- *.fixed.csv: CSV results, status adjusted according to the validation results
- *.logfiles.zip: output from the tools
- *.json.gz: mapping from file names to SHA-256 hashes of the file contents
- <validator>*.table.html: HTML views of the full benchmark set (all categories) for each validator
- <category>*.table.html: HTML views of the benchmark set for each category over all validators
- *.xml: XML table definitions for the above tables
- validators.*: statistics of the validator runs (obsolete)
- correctness: infix for validation of correctness witnesses
- violation: infix for validation of violation witnesses
- 1.0: infix for validation of v1.0 witnesses
- 2.0: infix for validation of v2.0 witnesses
- quantilePlot-*: score-based quantile plots as visualization of the results
- quantilePlotShow.gp: example Gnuplot script to generate a plot
- score*: accumulated score results in various formats
The folder results-verified/ contains the results from verification runs and aggregated results:
- index.html: overview web page with rankings and score table
- design.css: HTML style definitions
- *.results.txt: TXT results from BenchExec
- *.xml.bz2: XML results from BenchExec
- *.fixed.xml.bz2: XML results from BenchExec, status adjusted according to the validation results
- *.logfiles.zip: output from the tools
- *.json.gz: mapping from file names to SHA-256 hashes of the file contents
- *.xml.bz2.table.html: HTML views of the detailed results data as generated by BenchExec's table generator
- <verifier>*.table.html: HTML views of the full benchmark set (all categories) for each verifier
- META_*.table.html: HTML views of the benchmark set for each meta category, per verifier and over all verifiers
- <category>*.table.html: HTML views of the benchmark set for each category over all verifiers
- *.xml: XML table definitions for the above tables
- results-per-tool.php: list of results for each tool, for the review process in the pre-run phase
- <verifier>.list.html: list of results for a tool in HTML format with links
- quantilePlot-*: score-based quantile plots as visualization of the results
- quantilePlotShow.gp: example Gnuplot script to generate a plot
- score*: accumulated score results in various formats
The mappings from file names to SHA-256 content hashes (in the files *.json.gz) are useful for
- validating the exact contents of a file and
- accessing the files from the witness store.
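The first use can be scripted. Below is a minimal sketch in Python; it assumes each *.json.gz file contains a flat JSON object mapping a file name to the SHA-256 hex digest of its content (the exact layout of the mapping files is an assumption here):

```python
import gzip
import hashlib
import json


def sha256_of(path):
    """Compute the SHA-256 hex digest of a file's content."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_mapping(mapping_path, file_path, name):
    """Check a file's content against the hash recorded for `name`
    in a gzip-compressed JSON mapping (assumed flat: name -> digest)."""
    with gzip.open(mapping_path, "rt", encoding="utf-8") as f:
        mapping = json.load(f)
    return mapping.get(name) == sha256_of(file_path)
```

The same digests can serve as keys when looking files up in the witness store.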
Related Archives
Overview of archives from SV-COMP 2026 that are available at Zenodo:
- https://doi.org/10.5281/zenodo.18651735 Verification Witnesses from Verification Tools (SV-COMP 2026). Witness store (containing the generated verification witnesses)
- https://doi.org/10.5281/zenodo.18650756 FM-Tools Release 2.3: Data Set of Metadata about Tools for Formal Methods (SV-COMP 2026, Test-Comp 2026). Metadata snapshot of the evaluated tools (DOIs, options, etc.)
- https://doi.org/10.5281/zenodo.18651757 Results of the 15th Intl. Competition on Software Verification (SV-COMP 2026). Results (XML result files, log files, file mappings, HTML tables)
- https://doi.org/10.5281/zenodo.18650775 SV-Benchmarks: Benchmark Set for Software Verification and Testing (SV-COMP 2026, Test-Comp 2026). Verification tasks, version svcomp26
- https://doi.org/10.5281/zenodo.18455156 sosy-lab/benchexec: Release 3.34 Benchmarking framework
- https://doi.org/10.5281/zenodo.18650812 FM-Weck: Release 1.6.0 Containerized execution and continuous testing of formal-methods tools
All benchmarks were executed for SV-COMP 2026 https://sv-comp.sosy-lab.org/2026/ by Dirk Beyer, LMU Munich, based on the following components:
- https://gitlab.com/sosy-lab/benchmarking/fm-tools 2.3
- https://gitlab.com/sosy-lab/benchmarking/sv-benchmarks svcomp26
- https://gitlab.com/sosy-lab/sv-comp/bench-defs svcomp26
- https://gitlab.com/sosy-lab/software/benchexec 3.34
- https://gitlab.com/sosy-lab/software/benchcloud 1.5.0
- https://gitlab.com/sosy-lab/software/fm-weck 1.6.0
- https://gitlab.com/sosy-lab/benchmarking/sv-witnesses 2.1.2
- https://gitlab.com/sosy-lab/benchmarking/competition-scripts svcomp26
Contact
Feel free to contact me in case of questions: https://www.sosy-lab.org/people/beyer/
Files
- svcomp26-results.zip (29.5 GB), md5:61aa870303d56a2f288727aa46dbfefb