
Dataset Open Access

Replication Package for "Trust Enhancement Issues in Program Repair"

Noller, Yannic; Shariffdeen, Ridwan; Gao, Xiang; Roychoudhury, Abhik

This is the replication artifact for our work on "Trust Enhancement Issues in Program Repair". The corresponding paper has been published at the International Conference on Software Engineering (ICSE) 2022 and is available at https://doi.org/10.1145/3510003.3510040. A pre-print of our work is available on arXiv: https://arxiv.org/pdf/2108.13064.pdf.

The artifact is organized in two parts:

  1. the artifacts for our developer survey, and
  2. the artifacts for our empirical assessment of state-of-the-art automated program repair (APR) techniques.


1. Survey Artifacts

The survey folder includes:

  • Survey_Form.pdf -- the PDF version of our survey's web form.
  • Study_Results.pdf -- a summary of the questions and responses.
  • Codebooks.xlsx -- all created codebooks.
  • CodedResults.xlsx -- the responses to all questions, the corresponding coding, and the statistics we applied during our analysis. It also includes plots for all responses, among them the plots that appear in our paper.


2. Experiment Artifacts

The experiments folder includes:

  • tools.md -- It lists and describes the APR techniques that we used in our experiments.
  • Results.xlsx -- Contains the results of the experiments for each tool we considered, configuration details, and all the data from the ManyBugs benchmark.
  • protocols/ -- This folder includes the analysis protocols, which describe for each tool "how" we extracted the values for our evaluation metrics (see Table 3 in our paper).
  • results/ -- Contains the log files and relevant outputs for all tools and configurations. In particular, it includes the generated patches.
  • subjects/ -- Contains the subjects taken from the ManyBugs benchmark. For our experiments, we made some changes to the instrumentation, test IDs, etc. The file meta-data.json records the configuration, relevant test cases, etc.
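Each subject's meta-data.json can be consumed programmatically when re-running the experiments. A minimal sketch in Python, using an illustrative schema (the field names below are assumptions for demonstration, not taken from the archive; consult the actual files in subjects/ for the real structure):

```python
import json

# Hypothetical excerpt of a subjects/<subject>/meta-data.json entry.
# All field names here are illustrative assumptions.
example = """
{
  "subject": "example-bug-1",
  "benchmark": "ManyBugs",
  "relevant_tests": ["t1", "t2"],
  "config": {"timeout_minutes": 60}
}
"""

meta = json.loads(example)
# List the subject and its relevant test cases, as recorded in the metadata.
print(meta["subject"], meta["relevant_tests"])
```

A script along these lines could iterate over all entries in subjects/ to collect the configuration and test-case information per subject.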

