
Published January 22, 2021 | Version svcomp21 | Open-access dataset

Results of the 10th Intl. Competition on Software Verification (SV-COMP 2021)

  • Dirk Beyer (LMU Munich)

Description

Competition Results

This file describes the contents of an archive of the 10th Competition on Software Verification (SV-COMP 2021).
https://sv-comp.sosy-lab.org/2021/

The competition was run by Dirk Beyer, LMU Munich, Germany.
More information is available in the following article:
Dirk Beyer. Software Verification: 10th Comparative Evaluation (SV-COMP 2021). In Proceedings of the 27th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS 2021, Luxembourg, March 27 - April 1), 2021. Springer.

Copyright (C) Dirk Beyer
https://www.sosy-lab.org/people/beyer/

SPDX-License-Identifier: CC-BY-4.0
https://spdx.org/licenses/CC-BY-4.0.html

To browse the competition results with a web browser, there are two options:

  • start a local web server, e.g., php -S localhost:8000, in order to view the data in this archive (a Python alternative is sketched after this list), or
  • browse https://sv-comp.sosy-lab.org/2021/results/ in order to view the data on the SV-COMP web page.
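
If PHP is not available, any static file server started from the root of the unpacked archive can serve the HTML pages. A minimal sketch using only Python's standard library (note that, unlike php -S, this does not execute PHP code, so any pages that rely on PHP will not work):

    # Minimal static file server, an alternative to `php -S localhost:8000`.
    # Run from the root directory of the unpacked archive, then open
    # http://localhost:8000/index.html in a browser.
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), SimpleHTTPRequestHandler).serve_forever()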

Contents

  • index.html: directs to the overview web page
  • LICENSE.txt: specifies the license
  • README.txt: this file
  • results-validated/: results of validation runs
  • results-verified/: results of verification runs and aggregated results

The folder results-validated/ contains the results from validation runs:

  • *.xml.bz2: XML results from BenchExec (a parsing sketch follows this list)
  • *.logfiles.zip: output from tools
  • *.json.gz: mapping from file names to SHA-256 hashes of the file content
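
The *.xml.bz2 files can be read with Python's standard library; a hedged sketch follows, assuming the usual BenchExec result layout in which a <result> root contains <run> elements whose <column> children carry title/value pairs such as status:

    # Sketch: decompress one BenchExec result file and tally the reported
    # statuses. Inspect the XML yourself before relying on this layout.
    import bz2
    import collections
    import sys
    import xml.etree.ElementTree as ET

    def tally_statuses(path):  # path to a *.xml.bz2 file from this archive
        with bz2.open(path, "rb") as f:
            root = ET.parse(f).getroot()
        counts = collections.Counter()
        for run in root.iter("run"):
            for col in run.iter("column"):
                if col.get("title") == "status":
                    counts[col.get("value")] += 1
        return counts

    if __name__ == "__main__":
        print(tally_statuses(sys.argv[1]))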

The folder results-verified/ contains the results from verification runs and aggregated results:

  • index.html: overview web page with rankings and score table
  • *.xml.bz2: XML results from BenchExec
  • *.merged.xml.bz2: XML results from BenchExec, status adjusted according to the validation results
  • *.logfiles.zip: output from tools (a sketch for reading these archives follows this list)
  • *.json.gz: mapping from file names to SHA-256 hashes of the file content
  • *.xml.bz2.table.html: HTML views of the detailed results data as generated by BenchExec’s table generator
  • *.All.table.html: HTML views of the full benchmark set (all categories) for each tool
  • META_*.table.html: HTML views of the benchmark set for each meta category for each tool, and over all tools
  • <category>*.table.html: HTML views of the benchmark set for each category over all tools
  • iZeCa0gaey.html: HTML views per tool
  • validatorStatistics.html: statistics of the validator runs
  • quantilePlot-*: score-based quantile plots as visualization of the results
  • quantilePlotShow.gp: example Gnuplot script to generate a plot
  • score*: accumulated score results in various formats
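
The raw tool output behind these tables is stored in the *.logfiles.zip archives. A hedged sketch for inspecting them with Python's standard library (the internal directory layout is an assumption; check namelist() before hard-coding any paths):

    # Sketch: list the per-task log files inside a *.logfiles.zip archive
    # and print the first one as an example (encoding assumed to be UTF-8).
    import sys
    import zipfile

    def show_logs(path, limit=5):
        with zipfile.ZipFile(path) as zf:
            names = zf.namelist()
            for name in names[:limit]:
                print(name)
            if names:
                print(zf.read(names[0]).decode("utf-8", errors="replace"))

    if __name__ == "__main__":
        show_logs(sys.argv[1])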

The hashes of the file contents (in the *.json.gz files) are useful for

  • validating the exact contents of a file (a sketch follows below) and
  • accessing the files from the witness store.
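
A hedged sketch of the first use case, assuming each *.json.gz file holds a flat JSON object that maps file names to hex-encoded SHA-256 digests (verify the actual structure before relying on it):

    # Sketch: check a local file against the name-to-SHA-256 mapping.
    import gzip
    import hashlib
    import json
    import sys

    def verify(mapping_path, file_name, local_path):
        with gzip.open(mapping_path, "rt", encoding="utf-8") as f:
            mapping = json.load(f)  # assumed: {file name: hex digest}
        with open(local_path, "rb") as f:
            actual = hashlib.sha256(f.read()).hexdigest()
        return mapping[file_name] == actual

    if __name__ == "__main__":
        print(verify(*sys.argv[1:4]))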

Other Archives

Overview of the archives from SV-COMP 2021 that are available at Zenodo:

All benchmarks were executed for SV-COMP 2021 (https://sv-comp.sosy-lab.org/2021/) by Dirk Beyer, LMU Munich, based on the following components:

Contact

Feel free to contact me in case of questions: https://www.sosy-lab.org/people/beyer/

Files

  • svcomp21-results.zip (5.2 GB, md5:5c1749a14efebcce8928582e432d1e25)
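
To check that the download completed intact, the MD5 checksum listed above can be recomputed locally; a minimal sketch in Python:

    # Sketch: recompute the MD5 checksum of the downloaded archive and
    # compare it with the value published on this Zenodo record.
    import hashlib

    EXPECTED = "5c1749a14efebcce8928582e432d1e25"

    def md5_of(path, chunk_size=1 << 20):
        h = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        print(md5_of("svcomp21-results.zip") == EXPECTED)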