Published November 2, 2019 | Version v1
Conference paper · Open Access

KIT's Submission to the IWSLT 2019 Shared Task on Text Translation

  • Karlsruhe Institute of Technology

Description

In this paper, we describe KIT's submission to the IWSLT 2019 shared task on text translation. Our system is based on the transformer model [1], using our in-house implementation. We augment the available training data with back-translation and employ fine-tuning for the final model. For our best results, we used a 12-layer transformer-big configuration, achieving state-of-the-art results on the WMT2018 test set. We also experiment with student-teacher models to improve the performance of smaller models.
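The abstract mentions student-teacher training for smaller models but does not spell out the objective. Below is a minimal, hypothetical sketch of a token-level knowledge-distillation loss of the kind commonly used for this purpose: the student is trained against a blend of the reference translation and the teacher's softened output distribution. The function name, hyperparameters (temperature, alpha), and the PyTorch formulation are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, gold_ids,
                      temperature=2.0, alpha=0.5, pad_id=0):
    """Blend cross-entropy on gold tokens with KL to the teacher's distribution.

    student_logits, teacher_logits: (batch, seq_len, vocab)
    gold_ids: (batch, seq_len) reference token ids
    """
    vocab = student_logits.size(-1)
    mask = (gold_ids != pad_id).float()

    # Standard cross-entropy against the reference translation.
    ce = F.cross_entropy(student_logits.view(-1, vocab), gold_ids.view(-1),
                         reduction="none").view_as(gold_ids)

    # Soft targets: match the teacher's temperature-smoothed token distribution.
    t_probs = F.softmax(teacher_logits / temperature, dim=-1)
    s_logp = F.log_softmax(student_logits / temperature, dim=-1)
    kl = F.kl_div(s_logp, t_probs, reduction="none").sum(-1) * temperature ** 2

    # Interpolate the two losses and average over non-padding positions.
    loss = alpha * ce + (1.0 - alpha) * kl
    return (loss * mask).sum() / mask.sum()

# Toy usage with random tensors (batch=2, seq_len=5, vocab=100).
student = torch.randn(2, 5, 100)
teacher = torch.randn(2, 5, 100)
gold = torch.randint(1, 100, (2, 5))
print(distillation_loss(student, teacher, gold))
```

The interpolation weight alpha and the temperature are the usual knobs: a higher temperature exposes more of the teacher's probability mass over alternative tokens, which is what gives the smaller student extra signal beyond the single reference.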

Files

IWSLT2019_paper_30.pdf (95.6 kB)
md5:5e830364de4b52bbdbe9d55348d8bba2