Conference paper Open Access

Efficient Bilingual Generalization from Neural Transduction Grammar Induction

Yan, Yuchen; Wu, Dekai; Kumyol, Serkan

We introduce (1) a novel neural network structure for bilingual modeling of sentence pairs that allows efficient capture of bilingual relationships via biconstituent composition, (2) the concept of neural network biparsing, which applies not only to machine translation (MT) but also to a variety of other bilingual research areas, and (3) the concept of a biparsing-backpropagation training loop, which we hypothesize can efficiently learn complex biparse tree patterns. Our work is distinguished from the sequential attention-based models more traditionally found in neural machine translation (NMT) in three aspects. First, our model enforces compositional constraints. Second, our model has a smaller search space for discovering bilingual relationships within bilingual sentence pairs. Third, our model produces explicit biparse trees, which enable transparent error analysis during evaluation and external tree constraints during training.
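The abstract's notion of biconstituent composition suggests a recursive composition operation over nodes of a biparse tree. The sketch below is only an illustrative assumption of what such a cell could look like, not the paper's actual parameterization: the class name BiconstituentComposer, the embedding dimension, and the straight/inverted distinction (borrowed from inversion-transduction-grammar-style biparsing) are all hypothetical choices made here for concreteness.

```python
# Illustrative sketch (not the authors' model): a recursive composition cell
# that merges two child biconstituent embeddings into a parent embedding and
# scores the resulting node, as one might do when building a biparse tree.
import torch
import torch.nn as nn

class BiconstituentComposer(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Separate parameters for straight (monotonic) and inverted composition,
        # mirroring the two orientations a transduction grammar distinguishes.
        self.straight = nn.Linear(2 * dim, dim)
        self.inverted = nn.Linear(2 * dim, dim)
        self.score = nn.Linear(dim, 1)

    def forward(self, left: torch.Tensor, right: torch.Tensor, inverted: bool = False):
        """Compose two child biconstituent vectors into a parent vector and score it."""
        pair = torch.cat([left, right], dim=-1)
        layer = self.inverted if inverted else self.straight
        parent = torch.tanh(layer(pair))
        return parent, self.score(parent)

# Usage: compose two 64-dimensional child embeddings with inverted orientation.
composer = BiconstituentComposer(dim=64)
left, right = torch.randn(64), torch.randn(64)
parent, node_score = composer(left, right, inverted=True)
```

In a biparsing-backpropagation training loop of the kind the abstract hypothesizes, node scores like the one above would presumably guide biparse tree construction, with gradients flowing back through the chosen compositions.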

Files (384.4 kB)
Name: IWSLT2019_paper_8.pdf
md5: fbc4f1cc1d95dddc39d301713f962a8e
Size: 384.4 kB
