Inference of Resource Management Specifications
Creators
- 1. University of California, Riverside
- 2. Microsoft Research
- 3. UW CSE
- 4. New Jersey Institute of Technology
Description
This upload is a Docker image containing the artifact accompanying our OOPSLA 2023 paper "Inference of Resource Management Specifications". Before running the image, install Docker following the directions at https://www.docker.com/get-started for your OS, if it is not already installed. We have tested the artifact with Docker Desktop on macOS, but it should work on other operating systems as well.
For Java experiments:
- Unzip the provided Docker image. `gunzip -c path/to/resource_leak_inference.tar.gz > resource_leak_inference.tar`
- Load it into Docker. `docker load < resource_leak_inference.tar`
- Run the image. This should open a bash shell at the home directory of the `oopsla` user. `docker run -it nargeshdb/resource_leak_inference:latest`
Instructions for how to run the paper's experiments are inside the container in the `rlci-paper/README.md` file in the `oopsla` user's home directory.
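For convenience, the Java steps above can also be run end to end as a single shell session. This is a minimal sketch; `path/to/` is a placeholder for wherever you downloaded the archive.

```bash
# Unpack the compressed image (path/to/ is a placeholder).
gunzip -c path/to/resource_leak_inference.tar.gz > resource_leak_inference.tar

# Load the image into Docker and start an interactive container.
docker load < resource_leak_inference.tar
docker run -it nargeshdb/resource_leak_inference:latest

# Inside the container, follow rlci-paper/README.md in the oopsla home directory.
```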
Note: Our Java implementation is currently undergoing code review so that it can be incorporated into the Checker Framework; a future release of the framework will include it. See the Checker Framework instructions for how to reuse our tool on other benchmarks.
For C# experiments:
- Unzip the provided Docker image. `gunzip -c path/to/oopsla-artifact-csharp-471.tar.gz > oopsla-artifact-csharp-471.tar`
- Load it into Docker. `docker load --input oopsla-artifact-csharp-471.tar`
- Run the image. This should open a bash shell at the home directory of the `oopsla` user. `docker run -it --user oopsla oopsla-artifact-csharp-471 /bin/bash`
Instructions for how to run the paper's experiments are inside the container in the `README.md` file in the `oopsla` user's home directory.
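The C# steps can likewise be chained in one shell session, shown below as a sketch under the same assumption that `path/to/` points to the downloaded archive.

```bash
# Unpack the compressed image (path/to/ is a placeholder).
gunzip -c path/to/oopsla-artifact-csharp-471.tar.gz > oopsla-artifact-csharp-471.tar

# Load the image into Docker and start an interactive container as the oopsla user.
docker load --input oopsla-artifact-csharp-471.tar
docker run -it --user oopsla oopsla-artifact-csharp-471 /bin/bash

# Inside the container, follow README.md in the oopsla home directory.
```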
The `Manual-Report.xlsx` file contains the data presented in Table 2 that cannot be generated automatically and was collected as part of our manual investigation.