Published September 30, 2023 | Version v3
Dataset Open

Understanding API Usage and Testing: An Empirical Study of C Libraries (Artifact)

  • Imperial College London

Description

LibProbe Artifact

For the evaluation we preprocessed the CCScanner data to identify all clients of the libraries used in our evaluation and included them in our MongoDB database. The evaluation starts by loading the provided Docker image and importing this pre-processed dependency information into its MongoDB database, as described below.

Running the docker image

  • First load the docker image by running

    • gunzip -c libprobe_v2.2.tar.gz | sudo docker import - libprobe:latest

    This loads the image into your local Docker images as libprobe:latest.

  • Run a container from the image: sudo docker run -it --device /dev/snd --privileged libprobe:latest /bin/bash. This starts a container that passes the host's sound device (/dev/snd) through to the container, so that clients depending on PulseAudio/ALSA build and run correctly. This is necessary for some clients of some target libraries. You will need an Ubuntu host machine with PulseAudio and ALSA installed.

  • Run mongod --fork --logpath /var/log/mongodb/mongod.log followed by mongorestore --drop --db apiusage /tmp/libprobe/database/apiusage to import the results saved in the artifact. 
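  • (optional) To sanity-check the restore, list the collections in the apiusage database from the MongoDB shell. The shell binary depends on the MongoDB version shipped in the image (mongo on older installations, mongosh on newer ones), and the collection names themselves are artifact-specific, so treat the commands below as a sketch rather than part of the prescribed workflow.

    • mongo apiusage --eval 'db.getCollectionNames()'
    • mongosh apiusage --eval 'db.getCollectionNames()'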

Validating analysis results

  • The provided Docker image does not contain any clients, due to size limitations on sharing. The Mongo database contains all the results of running this evaluation.
  • We provide a script clone_clients.py, together with a client_repos.json file, in /tmp/libprobe/extra; it can be used to clone the clients into the clients directory (see the sketch after this list).
  • To regenerate the results, run python3 libprobe.py analyse all -n from /tmp/libprobe. This overwrites the JSON files in the json_files directory and the graphs in the graphs directory.
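The clone-and-reanalyse sequence can be chained as below. The exact invocation of clone_clients.py is not documented here, so the argument-free call is an assumption; check the script (or its --help output, if any) before running it.

  • cd /tmp/libprobe/extra && python3 clone_clients.py    (assumed invocation; it reads client_repos.json from the same directory)
  • cd /tmp/libprobe && python3 libprobe.py analyse all -n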

Running the evaluation for one library (vorbis)

  • Download clients: Go to /tmp/libprobe and run python3 libprobe.py download vorbis
  • Prepare the library:
    • In /tmp/data/libraries/xiph@@vorbis run make clean, then make, make check, and make install.
    • Copy all C files from /tmp/data/libraries/xiph@@vorbis/lib to /tmp/data/libraries/xiph@@vorbis/lib/.libs so that accurate API coverage information can be collected (the consolidated command sketch after this list shows one way to do this).
  • Process the library to get the APIs and the coverage information: python3 libprobe.py processlib vorbis
  • Prepare the clients by excluding subdirectories that might contain vorbis library code: python3 libprobe.py prepclients vorbis
  • Get client usages: python3 libprobe.py fetchusages vorbis
  • Analyse: python3 libprobe.py analyse vorbis -n
  • (optional) Measure differential coverage for improved coverage libs: python3 libprobe.py coverage vorbis
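Put together, the vorbis walkthrough above corresponds roughly to the following command sequence. The cp line is one way to realise the "copy all C files" step; everything else is taken directly from the steps above.

  • cd /tmp/libprobe && python3 libprobe.py download vorbis
  • cd /tmp/data/libraries/xiph@@vorbis
  • make clean && make && make check && make install
  • cp lib/*.c lib/.libs/    (puts the sources next to the gcov data in .libs)
  • cd /tmp/libprobe
  • python3 libprobe.py processlib vorbis
  • python3 libprobe.py prepclients vorbis
  • python3 libprobe.py fetchusages vorbis
  • python3 libprobe.py analyse vorbis -n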

Running the evaluation for all libraries (this requires at least 300 GB of disk space)

  • Download clients: Go to /tmp/libprobe and run python3 libprobe.py download all
  • Process the libraries: python3 libprobe.py processlib all
  • Prepare clients: python3 libprobe.py prepclients all
  • Get usages: python3 libprobe.py fetchusages all
  • Analyse: python3 libprobe.py analyse all -n
  • (optional) Measure differential coverage for improved coverage libs: python3 libprobe.py coverage <library>
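The optional coverage step takes one library name at a time, so for several libraries a small shell loop is convenient. The names below are just the libraries referred to elsewhere in this README, not necessarily the full evaluation set.

  • cd /tmp/libprobe
  • for lib in vorbis lmdb fftw3 sdl; do python3 libprobe.py coverage "$lib"; done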

Getting baseline coverage for libraries

All libraries are cloned in /tmp/data/libraries and clients are cloned in /tmp/data/clients.

 
  • Mbed TLS: Copy the script coverage_mbedtls.sh from /tmp/libprobe/extra into the build directory of Mbed TLS, /tmp/data/libraries/Mbed-TLS@@mbedtls/build, and run ./coverage_mbedtls.sh followed by genhtml baseline.info --output-directory out. This calculates the baseline coverage for mbedtls.
  • FFTW: Copy the script coverage.sh from /tmp/libprobe/extra into the root directory of FFTW and run ./coverage.sh baseline. This calculates the baseline coverage for FFTW.
  • HDF5: Copy the script coverage_hdf.sh from /tmp/libprobe/extra into the root directory of HDF5 and run ./coverage_hdf.sh baseline. This calculates the baseline coverage for HDF5.
  • LMDB: Copy the script coverage_lmdb.sh from /tmp/libprobe/extra into /tmp/data/libraries/LMDB@@lmdb/libraries/liblmdb and run ./coverage_lmdb.sh baseline. This calculates the baseline coverage for LMDB.
  • Zip: Copy the script coverage_zip.sh from /tmp/libprobe/extra into /tmp/data/libraries/kuba--@@zip/build/CMakeFiles/zip.dir/src and run ./coverage_zip.sh baseline. This calculates the baseline coverage for zip.
  • Vorbis: Copy the script cal_cov.py from /tmp/libprobe/extra to /tmp/data/libraries/xiph@@vorbis/lib/.libs, then copy all source files into the .libs folder by running cp ../*.c . from the .libs folder. Finally, run python3 cal_cov.py ..
  • xxHash: Copy the script coverage.sh from /tmp/libprobe/extra into /tmp/data/libraries/Cyan4973@@xxHash and run ./coverage.sh baseline. This calculates the baseline coverage for xxHash.
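The coverage_*.sh scripts shipped in /tmp/libprobe/extra are the authoritative way to collect these baselines. Purely as an illustration of what such a script typically boils down to (this is not the artifact's script), assuming the library was built with gcov instrumentation and its test suite has already been run:

  • lcov --capture --directory . --output-file baseline.info    (collect the gcov counters)
  • lcov --remove baseline.info '/usr/*' --output-file baseline.info    (optionally drop system headers)
  • genhtml baseline.info --output-directory out    (render the HTML report, as in the Mbed TLS and SDL steps)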

Reproducing increased coverage using clients

  • LMDB: The client we will use is Knot DNS.

    • Change directory to /tmp/data/clients/CZ-NIC@@knot and run ./autogen.sh.
    • Run ./configure --with-lmdb=/usr/local.
    • Then make && make check.

    Now go back to the LMDB directory (/tmp/data/libraries/LMDB@@lmdb/libraries/liblmdb, where coverage_lmdb.sh was copied for the baseline measurement).

    • Run ./coverage_lmdb.sh after_knot.

    • Now go to /tmp/libprobe and run python3 libprobe.py coverage lmdb
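    Collected into one sequence (the liblmdb path is the one used in the baseline step above), the LMDB/Knot steps are:

    • cd /tmp/data/clients/CZ-NIC@@knot
    • ./autogen.sh && ./configure --with-lmdb=/usr/local
    • make && make check
    • cd /tmp/data/libraries/LMDB@@lmdb/libraries/liblmdb
    • ./coverage_lmdb.sh after_knot
    • cd /tmp/libprobe && python3 libprobe.py coverage lmdb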

  • VORBIS: The client we will use is SFML.

    Go to the vorbis library dir /tmp/data/libraries/xiph@@vorbis and run make clean.

    • Run make && make check && make install.
    • Go to the .libs folder and copy all C files there by running cp ../*.c . from inside .libs.
    • Copy /tmp/libprobe/extra/cal_cov.py into the .libs folder and run python3 cal_cov.py .. This will show the baseline coverage.

    Go to the SFML directory /tmp/data/clients/SFML@@SFML.

    • Create build directory mkdir build && cd build.
    • Run cmake -DSFML_BUILD_TEST_SUITE=TRUE -GNinja .. (the trailing .. points at the SFML source directory).
    • Run ninja.
    • Run ctest. You will see some failing tests; that's normal, as we are only interested in the Audio tests for vorbis. All Audio tests should pass.

    Go back to the .libs folder in vorbis and re-run the cal_cov.py script.

    • Now go to /tmp/libprobe and run python3 libprobe.py coverage vorbis
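    Since only the Audio tests matter for vorbis, you can optionally restrict ctest to them instead of running the full suite. The "Audio" pattern below is an assumption about how the SFML tests are named; use ctest -N to list the actual test names first.

    • ctest -N | grep -i audio    (list the available tests and find the audio ones)
    • ctest -R Audio    (run only the tests whose names match the pattern)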
  • SDL: The client we will use is UFOAI.

    Go to the SDL library dir /tmp/data/libraries/libsdl-org@@SDL and then enter the build2 directory, where the built library is.

    • Run make clean && make && make install && make test.
    • Copy /tmp/libprobe/extra/coverage_sdl.sh into the build2 folder and run ./coverage_sdl.sh baseline, then genhtml baseline.info --output-directory out.

    Go to the UFOAI directory /tmp/data/clients/ufoaiorg@@ufoai.

    • Run ./configure --target-os=linux --disable-uforadiant && make.
    • Run ./testall

    • Now go to /tmp/libprobe and run python3 libprobe.py coverage sdl
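    Note that, by analogy with the other libraries, the SDL coverage script presumably has to be re-run after the UFOAI tests so that the coverage step has a post-client measurement to compare against the baseline. The after_ufoai label below is an assumption mirroring after_knot and after_cava; it is not taken from the artifact documentation.

    • cd /tmp/data/libraries/libsdl-org@@SDL/build2
    • ./coverage_sdl.sh after_ufoai    (assumed label)
    • cd /tmp/libprobe && python3 libprobe.py coverage sdl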

 

  • FFTW: The client we will use is CAVA.

    Go to the FFTW3 library dir /tmp/data/libraries/FFTW@@fftw3 and run reset_cov.sh, then make clean.

    • Run make && make install && make check.
    • Copy /tmp/libprobe/extra/coverage.sh into the root directory of the library and run ./coverage.sh baseline. This will show the baseline coverage.

    Go to the CAVA directory /tmp/data/clients/karlstav@@cava.

    • Run ./autogen.sh followed by ./configure, then make.
    • Run the script ./run_all_tests.sh

    Go back to the FFTW3 library and run ./coverage.sh after_cava.

    • Now go to /tmp/libprobe and run python3 libprobe.py coverage fftw3
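    Collected into one sequence, the FFTW/CAVA steps are as follows (the ./ prefix on reset_cov.sh is an assumption; the script lives in the FFTW3 directory):

    • cd /tmp/data/libraries/FFTW@@fftw3
    • ./reset_cov.sh && make clean && make && make install && make check
    • cp /tmp/libprobe/extra/coverage.sh . && ./coverage.sh baseline
    • cd /tmp/data/clients/karlstav@@cava
    • ./autogen.sh && ./configure && make
    • ./run_all_tests.sh
    • cd /tmp/data/libraries/FFTW@@fftw3 && ./coverage.sh after_cava
    • cd /tmp/libprobe && python3 libprobe.py coverage fftw3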

Files

Files (13.4 GB)

  • 13.4 GB, md5:21c526f313c1896f1be292b687e2e051

Additional details

Dates

Available
2024-06-06

Software

Development Status
Work in progress (WIP)

References

  • LibProbe