Published January 2, 2023 | Version v1
Software Open

A Deep Learning Model for Loop Interchange: Paper Artifact

  • 1. NYU Abu Dhabi, UAE; ESI, Algeria
  • 2. NYU Abu Dhabi, UAE
  • 3. ESI, Algeria
  • 4. MIT, USA

Description

This artifact accompanies the paper A Deep Learning Model for Loop Interchange, published at the CC 2023 conference. It introduces the model presented in the paper, which predicts the best loop interchange instance for an input Tiramisu program. The artifact reproduces the model's training on the provided datasets, as well as all tests reported in the paper, performed on both the test set and the benchmark. It mainly uses Python and PyTorch.
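To make the artifact's goal concrete, the sketch below illustrates, in schematic form, how a trained model of this kind could be used to rank candidate loop-interchange instances for a program. Every name here (pick_best_interchange, program_features, candidate_interchanges, the model's call signature) is a hypothetical placeholder, not taken from the artifact; the actual program representation and model interface are defined in the artifact's scripts.

    # Hypothetical sketch: ranking candidate loop interchanges with a trained model.
    # None of these names come from the artifact; they only illustrate the idea.
    import torch

    def pick_best_interchange(model, program_features, candidate_interchanges):
        """Return the candidate interchange the model predicts to be best.

        model                  -- a trained torch.nn.Module (assumption)
        program_features       -- tensor encoding of the input Tiramisu program (assumption)
        candidate_interchanges -- list of tensors, one per candidate interchange (assumption)
        """
        model.eval()
        scores = []
        with torch.no_grad():
            for candidate in candidate_interchanges:
                # The model is assumed to return a scalar score for a
                # (program, interchange) pair; higher means better predicted speedup.
                score = model(program_features, candidate)
                scores.append(score.item())
        best_index = max(range(len(scores)), key=lambda i: scores[i])
        return candidate_interchanges[best_index], best_index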

The tool is distributed as Python scripts together with pickle/JSON datasets. The scripts are described below:

  • Model_training.py: Requires no input, provided that all scripts and dataset files are in the same local folder and the default values are used. It outputs the trained model in pickle format. During execution it prints the loss values on both the training and the validation set, and at the end it reports the accuracy of the resulting model on both sets.
  • Model_tests.py: Loads the model saved under its default pickle name by the previous script and performs the tests described in the paper. It outputs the test results (accuracy and search performance) on both the synthetic test set and the benchmark, and additionally writes a text file with the search-performance results. A hedged sketch of loading the pickled model follows this list.
  • Utils.py: Helper functions used by the other scripts.
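The following sketch shows how the pickled model produced by Model_training.py might be loaded before evaluation. The file name is a placeholder, not the artifact's actual default; Model_tests.py is the script that performs the real tests.

    # Hypothetical sketch: loading the model that Model_training.py saves in pickle format.
    # The file name below is a placeholder (assumption), not the artifact's default.
    import pickle

    MODEL_PATH = "trained_model.pkl"  # placeholder name (assumption)

    with open(MODEL_PATH, "rb") as f:
        # Unpickling requires the class that defines the model (e.g., the artifact's
        # PyTorch module) to be importable in the current environment.
        model = pickle.load(f)

    model.eval()  # put the (assumed) torch.nn.Module into evaluation mode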

The artifact is accessible via this link: https://github.com/Tiramisu-Compiler/tiramisu/tree/master/utils/specialized_models/loop_interchange. Installation and usage details are presented in the README file of the repository.

Files

Files (254.9 MB)

  • md5:294d2f0f4c4c0abd1afd9272da5ce30f, 254.9 MB

Additional details

Related works

Is described by
Conference paper: 10.1145/3578360.3580257 (DOI)