Conference paper Open Access

A Structural Model for Contextual Code Changes

Brody, Shaked; Alon, Uri; Yahav, Eran

We address the problem of predicting edit completions based on a learned model that was trained on past edits.
Given a code snippet that is partially edited, our goal is to predict a completion of the edit for the rest of the
snippet. We refer to this task as the EditCompletion task and present a novel approach for tackling it. The
main idea is to directly represent structural edits. This allows us to model the likelihood of the edit itself, rather
than learning the likelihood of the edited code. We represent an edit operation as a path in the program’s Abstract
Syntax Tree (AST), originating from the source of the edit to the target of the edit. Using this representation, we
present a powerful and lightweight neural model for the EditCompletion task.
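To make the idea of "an edit as an AST path" concrete, below is a minimal, hypothetical sketch in Python. It walks from the source node of an edit up to the lowest common ancestor and then down to the target node, yielding a path of node types. The paper's model operates on C# ASTs and richer edit operations, so all names and the use of Python's `ast` module here are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, illustrative): represent an edit as the AST path
# from its source node to its target node. The actual model in the paper
# works over C# ASTs and labeled edit kinds; this only shows the path idea.
import ast


def path_from_root(root, target):
    """Return the chain of nodes from `root` down to `target`, or None if absent."""
    if root is target:
        return [root]
    for child in ast.iter_child_nodes(root):
        sub = path_from_root(child, target)
        if sub is not None:
            return [root] + sub
    return None


def ast_edit_path(root, source_node, target_node):
    """Climb from the source node to the lowest common ancestor, then descend
    to the target node; return the path as a list of AST node-type names."""
    src_chain = path_from_root(root, source_node)
    tgt_chain = path_from_root(root, target_node)
    # Lowest common ancestor = length of the shared prefix of the two root chains.
    lca_index = 0
    for a, b in zip(src_chain, tgt_chain):
        if a is b:
            lca_index += 1
        else:
            break
    up = [type(n).__name__ for n in reversed(src_chain[lca_index - 1:])]
    down = [type(n).__name__ for n in tgt_chain[lca_index:]]
    return up + down


tree = ast.parse("y = f(x)\nz = g(x)")
src = tree.body[0].value  # the call `f(x)`
tgt = tree.body[1].value  # the call `g(x)`
print(ast_edit_path(tree, src, tgt))
# ['Call', 'Assign', 'Module', 'Assign', 'Call']
```

Encoding the edit as such a path lets a model score the edit itself (its source, target, and the syntactic route between them) rather than the full edited program.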


We conduct a thorough evaluation, comparing our approach to a variety of representation and modeling
approaches that are driven by multiple strong models such as LSTMs, Transformers, and neural CRFs. Our
experiments show that our model achieves 28% relative gain over state-of-the-art sequential models and 2×
higher accuracy than syntactic models that learn to generate the edited code instead of modeling the edits
directly. We make our code, dataset, and trained models publicly available.

Files (97.8 MB)

Name                                                           Size      md5
A-Structural-Model-for-Contextual-Code-Changes-Artifact.zip   97.7 MB   ccf653c792052aa5b067f2ac4aef65a9
LICENSE                                                        1.1 kB   0bf5aa65d04262aa2bf7f26789f74aca
README.md                                                     15.1 kB   07372d09a48472a0e35bf78142c19bce
73 views, 16 downloads

                    All versions    This version
Views                         73              73
Downloads                     16              16
Data volume             488.8 MB        488.8 MB
Unique views                  70              70
Unique downloads              13              13
