3525486
doi
10.5281/zenodo.3525486
oai:zenodo.org:3525486
user-iwslt2019
Adapting Multilingual Neural Machine Translation to Unseen Languages
Lakew, Surafel M.
University of Trento & Fondazione Bruno Kessler, Trento, Italy
Karakanta, Alina
University of Trento & Fondazione Bruno Kessler, Trento, Italy
Federico, Marcello
Fondazione Bruno Kessler, Trento, Italy
Negri, Matteo
Fondazione Bruno Kessler, Trento, Italy
Turchi, Marco
Fondazione Bruno Kessler, Trento, Italy
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
<p>Multilingual Neural Machine Translation (MNMT) for low-resource languages (LRL) can be enhanced by the presence of related high-resource languages (HRL), but the relatedness of HRL usually relies on predefined linguistic assumptions about language similarity. Recently, adapting MNMT to a LRL has been shown to greatly improve performance. In this work, we explore the problem of adapting an MNMT model to an unseen LRL using data selection and model adaptation. In order to improve NMT for LRL, we employ perplexity to select HRL data that are most similar to the LRL on the basis of language distance. We extensively explore data selection in popular multilingual NMT settings, namely in (zero-shot) translation, and in adaptation from a multilingual pre-trained model, for both directions (LRL↔en). We further show that dynamic adaptation of the model’s vocabulary results in a more favourable segmentation for the LRL in comparison with direct adaptation. Experiments show reductions in training time and significant performance gains over LRL baselines, even with zero LRL data (+13.0 BLEU), up to +17.0 BLEU for pre-trained multilingual model dynamic adaptation with related data selection. Our method outperforms current approaches, such as massively multilingual models and data augmentation, on four LRL.</p>
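The perplexity-based data selection the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: it trains a simple character-trigram language model with add-one smoothing on the available LRL text (a stand-in for whatever LM the authors use), scores each HRL sentence by perplexity under that model, and keeps the lowest-perplexity (most LRL-like) fraction. All names (`NgramLM`, `select_related`, `top_fraction`) are hypothetical.

```python
import math
from collections import Counter

class NgramLM:
    """Character n-gram LM with add-one smoothing, trained on LRL seed text."""

    def __init__(self, sentences, n=3):
        self.n = n
        self.counts = Counter()   # n-gram counts
        self.context = Counter()  # (n-1)-gram context counts
        self.vocab = set()        # observed characters
        for s in sentences:
            s = " " * (n - 1) + s  # pad the start so every position has context
            for i in range(len(s) - n + 1):
                gram = s[i:i + n]
                self.counts[gram] += 1
                self.context[gram[:-1]] += 1
                self.vocab.add(gram[-1])

    def perplexity(self, sentence):
        """Per-character perplexity of a sentence under the smoothed LM."""
        s = " " * (self.n - 1) + sentence
        v = len(self.vocab) or 1
        log_p, m = 0.0, 0
        for i in range(len(s) - self.n + 1):
            gram = s[i:i + self.n]
            p = (self.counts[gram] + 1) / (self.context[gram[:-1]] + v)
            log_p += math.log(p)
            m += 1
        return math.exp(-log_p / max(m, 1))

def select_related(hrl_sentences, lm, top_fraction=0.5):
    """Keep the HRL sentences that look most like the LRL (lowest perplexity)."""
    scored = sorted(hrl_sentences, key=lm.perplexity)
    k = max(1, int(len(scored) * top_fraction))
    return scored[:k]
```

A sentence sharing the LRL's character statistics scores a much lower perplexity than an unrelated one, so sorting by perplexity surfaces the related HRL data first; the selected subset can then be used to train or adapt the MNMT model toward the unseen LRL.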
Zenodo
2019-11-02
info:eu-repo/semantics/conferencePaper
3525485
1579538685.705806
235536
md5:4e605da40b4505db8e668895a637119f
https://zenodo.org/records/3525486/files/IWSLT2019_paper_27.pdf
public
10.5281/zenodo.3525485
isVersionOf
doi