Conference paper Open Access
Macé, Valentin; Servan, Christophe
In Machine Translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a simple yet promising approach to adding contextual information in Neural Machine Translation. We present a method to add source context that captures the whole document with accurate boundaries, taking every word into account. We provide this additional information to a Transformer model and study the impact of our method on three language pairs. The proposed approach obtains promising results on the English-German, English-French and French-English document-level translation tasks. We observe interesting cross-sentential behaviors where the model learns to use document-level information to improve translation coherence.
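The abstract does not give implementation details, so the following is only a minimal sketch of one plausible way to feed whole-document source context to a Transformer encoder: compute a single document vector (here, the mean of all word embeddings in the document, a hypothetical choice) and add it to every source token embedding before encoding. The function name and formulation are assumptions, not the paper's exact method.

```python
import numpy as np

def add_document_context(src_embeddings, doc_token_embeddings):
    """Augment source token embeddings with a document-level vector.

    Hypothetical sketch: the document vector is the mean of all word
    embeddings in the document, broadcast-added to each source token
    embedding before the Transformer encoder. This is an illustrative
    assumption, not necessarily the paper's formulation.
    """
    doc_vector = doc_token_embeddings.mean(axis=0)   # shape: (d_model,)
    return src_embeddings + doc_vector               # broadcast over tokens

# Toy example: 4-token sentence, 10-token document, d_model = 8
rng = np.random.default_rng(0)
src = rng.normal(size=(4, 8))
doc = rng.normal(size=(10, 8))
augmented = add_document_context(src, doc)
```

Because the document vector is the same for every sentence in a document, such a scheme gives the model a stable signal it can use to keep lexical choices coherent across sentences.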
Name | Size
---|---
IWSLT2019_paper_20.pdf (md5:3a4a4614e5f77e181163dafd99370339) | 168.3 kB