
Published February 10, 2023 | Version svcomp23

Results of the 12th Intl. Competition on Software Verification (SV-COMP 2023)

  • Dirk Beyer, LMU Munich, Germany

Description

SV-COMP 2023

Competition Results

This file describes the contents of the results archive of the 12th International Competition on Software Verification (SV-COMP 2023): https://sv-comp.sosy-lab.org/2023/

The competition was organized by Dirk Beyer, LMU Munich, Germany. More information is available in the following article: Dirk Beyer. Competition on Software Verification and Witness Validation: SV-COMP 2023. In Proceedings of the 29th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS 2023, Paris, April 22-27), 2023. Springer.

Copyright (C) Dirk Beyer https://www.sosy-lab.org/people/beyer/

SPDX-License-Identifier: CC-BY-4.0 https://spdx.org/licenses/CC-BY-4.0.html

To browse the competition results with a web browser, open one of the two overview pages contained in this archive: index.html for the verification track or index-validation.html for the validation track.

Contents

  • index.html: directs to the overview web page of the verification track
  • index-validation.html: directs to the overview web page of the validation track
  • LICENSE-results.txt: specifies the license
  • README-results.txt: this file
  • results-validated/: results of validation runs
  • results-verified/: results of verification runs

The folder results-validated/ contains the results from validation runs:

  • index.html: overview web page with rankings and score table

  • design.css: HTML style definitions

  • *.results.txt: TXT results from BenchExec

  • *.xml.bz2: XML results from BenchExec (a parsing sketch follows after this list)

  • *.fixed.xml.bz2: XML results from BenchExec, status adjusted according to the validation results

  • *.logfiles.zip: output from tools

  • *.json.gz: mapping from file names to SHA-256 hashes of the file contents

  • <validator>*.table.html: HTML views of the full benchmark set (all categories) for each validator

  • <category>*.table.html: HTML views of the benchmark set for each category over all validators

  • *.xml: XML table definitions for the above tables

  • validators.*: Statistics of the validator runs (obsolete)

  • .correctness.: Infix for validation of correctness witnesses

  • .violation.: Infix for validation of violation witnesses

  • quantilePlot-*: score-based quantile plots as visualization of the results

  • quantilePlotShow.gp: example Gnuplot script to generate a plot

  • score*: accumulated score results in various formats
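The *.xml.bz2 result files listed above use BenchExec’s XML result format: a <result> root element containing one <run> element per verification task, each with <column> children such as status and cputime. The following minimal Python sketch (not part of the archive) decompresses one such file and prints the status and CPU time of every run; the file name is a placeholder, and the attribute names are assumed from BenchExec’s result format.

    #!/usr/bin/env python3
    # Minimal sketch: read a BenchExec result file (*.xml.bz2) from this archive.
    # The file name below is a placeholder; adjust it to an actual file.
    import bz2
    import xml.etree.ElementTree as ET

    RESULT_FILE = "results-validated/EXAMPLE.xml.bz2"  # placeholder name

    with bz2.open(RESULT_FILE, "rb") as f:
        root = ET.parse(f).getroot()

    # Root attributes identify the tool and its version.
    print("tool:", root.get("tool"), "version:", root.get("version"))

    # Each <run> describes one task; its <column> children hold the measurements.
    for run in root.iter("run"):
        columns = {c.get("title"): c.get("value") for c in run.findall("column")}
        print(run.get("name"), columns.get("status"), columns.get("cputime"))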

The folder results-verified/ contains the results from verification runs and aggregated results:

  • index.html: overview web page with rankings and score table

  • design.css: HTML style definitions

  • *.results.txt: TXT results from BenchExec

  • *.xml.bz2: XML results from BenchExec

  • *.fixed.xml.bz2: XML results from BenchExec, status adjusted according to the validation results

  • *.logfiles.zip: output from tools (an extraction sketch follows after this list)

  • *.json.gz: mapping from file names to SHA-256 hashes of the file contents

  • *.xml.bz2.table.html: HTML views of the detailed results data, as generated by BenchExec’s table generator

  • <verifier>*.table.html: HTML views of the full benchmark set (all categories) for each verifier

  • META_*.table.html: HTML views of the benchmark set for each meta category for each verifier, and over all verifiers

  • <category>*.table.html: HTML views of the benchmark set for each category over all verifiers

  • *.xml: XML table definitions for the above tables

  • validatorStatistics.html: Statistics of the validator runs (obsolete)

  • results-per-tool.php: List of results for each tool for review process in pre-run phase

  • <tool>.list.html: List of results for a tool in HTML format with links

  • quantilePlot-*: score-based quantile plots as visualization of the results

  • quantilePlotShow.gp: example Gnuplot script to generate a plot

  • score*: accumulated score results in various formats
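The *.logfiles.zip archives in both result folders bundle the raw tool output, one log file per run. The following minimal Python sketch (not part of the archive) lists the members of one such archive and prints the beginning of a single log; the archive name is a placeholder.

    #!/usr/bin/env python3
    # Minimal sketch: inspect a *.logfiles.zip archive of tool output.
    # The archive name below is a placeholder; adjust it to an actual file.
    import zipfile

    LOGFILES_ZIP = "results-verified/EXAMPLE.logfiles.zip"  # placeholder name

    with zipfile.ZipFile(LOGFILES_ZIP) as zf:
        names = zf.namelist()
        print(len(names), "log files, e.g.", names[:3])
        # Print the first 500 characters of one run's output.
        with zf.open(names[0]) as log:
            print(log.read().decode("utf-8", errors="replace")[:500])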

The file hashes (in the *.json.gz files) are useful for

  • validating the exact contents of a file and
  • accessing the files from the witness store.
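
For example, the following minimal Python sketch (not part of the archive) loads one of the *.json.gz mappings and re-computes the SHA-256 hash of a local file to compare it with the recorded value. All file names are placeholders, and the sketch assumes, based on the description above, that the JSON content is a plain object mapping file names to hex digests.

    #!/usr/bin/env python3
    # Minimal sketch: check a file against the SHA-256 mapping in a *.json.gz file.
    # Assumption: the JSON content maps file names to SHA-256 hex digests.
    # All names below are placeholders.
    import gzip
    import hashlib
    import json

    MAPPING_FILE = "results-verified/EXAMPLE.json.gz"  # placeholder name
    FILE_TO_CHECK = "path/to/some/file"                # placeholder name

    with gzip.open(MAPPING_FILE, "rt", encoding="utf-8") as f:
        name_to_hash = json.load(f)

    expected = name_to_hash.get(FILE_TO_CHECK)
    with open(FILE_TO_CHECK, "rb") as f:
        actual = hashlib.sha256(f.read()).hexdigest()

    print("match" if expected == actual else "mismatch: " + str(expected) + " vs " + actual)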

Other Archives

Overview of the archives from SV-COMP 2023 that are available at Zenodo:

All benchmarks were executed for SV-COMP 2023 https://sv-comp.sosy-lab.org/2023/ by Dirk Beyer, LMU Munich, based on the following components:

Contact

Feel free to contact me in case of questions: https://www.sosy-lab.org/people/beyer/

Files

  • svcomp23-results.zip (20.9 GB, md5:1c76a62dabec947899657fd476e86238)