Reproduction Package for the FSE 2024 Paper "EyeTrans: Merging Human and Machine Attention for Neural Code Summarization"
Description
This artifact accompanies our paper "EyeTrans: Merging Human and Machine Attention for Neural Code Summarization," which has been accepted for presentation at the ACM International Conference on the Foundations of Software Engineering (FSE) 2024.
The artifact contains the dataset derived from our eye-tracking human study of code comprehension, which was central to the development of the EyeTrans model, along with the source code for the research questions addressed in the paper.
Specifically, it includes the unprocessed data from the eye-tracking study, the scripts used to process that data, and the source code for the EyeTrans model, which merges human and machine attention within Transformer models. This resource is intended for researchers who want to replicate our study, conduct further analysis, or extend the techniques to new datasets in software engineering research.
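To give a flavor of the core idea before you open the released code, the sketch below shows one possible way to bias a Transformer self-attention layer with token-level human attention (for example, normalized fixation durations from an eye-tracking study). This is a minimal PyTorch illustration under our own assumptions, not the EyeTrans implementation shipped in this artifact; the class name, shapes, and the `human_weight` knob are hypothetical.

```python
# Hypothetical sketch: biasing self-attention logits with human fixation data.
# This is NOT the released EyeTrans code; it only illustrates the general idea
# of merging human and machine attention in a Transformer layer.
import torch
import torch.nn as nn
import torch.nn.functional as F


class HumanGuidedSelfAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int, human_weight: float = 0.5):
        super().__init__()
        assert d_model % n_heads == 0
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        self.human_weight = human_weight  # strength of the human-attention bias

    def forward(self, x: torch.Tensor, fixations: torch.Tensor) -> torch.Tensor:
        # x:         (batch, seq_len, d_model) token embeddings of the source code
        # fixations: (batch, seq_len) non-negative per-token fixation durations
        b, t, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        # Split into heads: (batch, heads, seq_len, d_head)
        q, k, v = (z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
                   for z in (q, k, v))

        # Machine attention logits (scaled dot product).
        logits = q @ k.transpose(-2, -1) / self.d_head ** 0.5

        # Human attention: normalize fixation durations into a distribution over
        # key positions and add its log as a bias, shared across heads and queries.
        human = fixations / fixations.sum(dim=-1, keepdim=True).clamp(min=1e-8)
        bias = torch.log(human.clamp(min=1e-8))            # (batch, seq_len)
        logits = logits + self.human_weight * bias[:, None, None, :]

        attn = F.softmax(logits, dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, t, -1)
        return self.out(out)


# Tiny smoke test with random embeddings and fixation durations.
layer = HumanGuidedSelfAttention(d_model=64, n_heads=4)
x = torch.randn(2, 10, 64)
fix = torch.rand(2, 10)
print(layer(x, fix).shape)  # torch.Size([2, 10, 64])
```

Adding the log of the human distribution to the attention logits is only one of several plausible fusion strategies; the EyeTrans source code in the archive documents the approach actually used in the paper.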
Files (376.5 MB)

| Name | Size | MD5 |
|---|---|---|
| (artifact archive) | 376.5 MB | 9c0c1ddf989a2c27a9ef892661b91e37 |
| README.md | 7.3 kB | 33419330d051f39ec6d858455bc5df84 |
Additional details
Funding
- U.S. National Science Foundation, Collaborative Research: SHF: Medium: Towards More Human-like AI Models of Source Code (Award 2211429)
- U.S. National Science Foundation, Collaborative Research: SHF: Medium: Towards More Human-like AI Models of Source Code (Award 2211428)
- U.S. National Science Foundation, Collaborative Research: SHF: Small: Context-aware Models of Source Code Summarization (Award 2100035)
- U.S. National Science Foundation, Collaborative Research: SaTC: EDU: RoCCeM: Bringing Robotics, Cybersecurity and Computer Science to the Middle School Classroom (Award SaTC-2312057)
Dates
- Accepted: 2024-01-23 (paper accepted to FSE'24)
Software
- Programming language: Python, Java
- Development Status: Active