Macé, Valentin
Servan, Christophe
2019-11-02
<p>In Machine Translation, considering the document as a whole can help to resolve ambiguities and inconsistencies. In this paper, we propose a simple yet promising approach to adding contextual information in Neural Machine Translation. We present a method to add source context that captures the whole document with accurate boundaries, taking every word into account. We provide this additional information to a Transformer model and study the impact of our method on three language pairs. The proposed approach obtains promising results on the English-German, English-French and French-English document-level translation tasks. We observe interesting cross-sentential behaviors where the model learns to use document-level information to improve translation coherence.</p>
https://doi.org/10.5281/zenodo.3525020
oai:zenodo.org:3525020
eng
Zenodo
https://zenodo.org/communities/iwslt2019
https://doi.org/10.5281/zenodo.3525019
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
Using Whole Document Context in Neural Machine Translation
info:eu-repo/semantics/conferencePaper