Conference paper Open Access

KIT's Submission to the IWSLT 2019 Shared Task on Text Translation

Schneider, Felix; Waibel, Alex

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Schneider, Felix</dc:creator>
  <dc:creator>Waibel, Alex</dc:creator>
  <dc:description>In this paper, we describe KIT’s submission for the IWSLT 2019 shared task on text translation. Our system is based on the transformer model [1] using our in-house implementation. We augment the available training data using back-translation and employ fine-tuning for the final model. For our best results, we used a 12-layer transformer-big configuration, achieving state-of-the-art results on the WMT2018 test set. We also experiment with student-teacher models to improve the performance of smaller models.</dc:description>
  <dc:title>KIT's Submission to the IWSLT 2019 Shared Task on Text Translation</dc:title>
</oai_dc:dc>
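As a minimal sketch (not part of the record itself), the Dublin Core export above can be read with Python's standard library; the namespace URI for the `dc` prefix is the standard Dublin Core element-set namespace:

```python
import xml.etree.ElementTree as ET

# A trimmed copy of the record above, embedded for a self-contained example.
RECORD = """<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/"
           xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/">
  <dc:creator>Schneider, Felix</dc:creator>
  <dc:creator>Waibel, Alex</dc:creator>
  <dc:title>KIT's Submission to the IWSLT 2019 Shared Task on Text Translation</dc:title>
</oai_dc:dc>"""

# Map the "dc" prefix to the standard Dublin Core namespace for XPath lookups.
NS = {"dc": "http://purl.org/dc/elements/1.1/"}

root = ET.fromstring(RECORD)
creators = [el.text for el in root.findall("dc:creator", NS)]
title = root.findtext("dc:title", namespaces=NS)
```

After parsing, `creators` holds both author names and `title` holds the paper title, so the record fields can be consumed programmatically.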
Statistics: All versions / This version
Views: 116 / 115
Downloads: 86 / 86
Data volume: 8.2 MB / 8.2 MB
Unique views: 102 / 101
Unique downloads: 80 / 80

