Conference paper Open Access

Controlling the Output Length of Neural Machine Translation

Lakew, Surafel Melaku; Di Gangi, Mattia; Federico, Marcello

MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="">
  <controlfield tag="005">20200120173343.0</controlfield>
  <controlfield tag="001">3524957</controlfield>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Fondazione Bruno Kessler, Trento, Italy &amp; University of Trento, Italy</subfield>
    <subfield code="a">Di Gangi, Mattia</subfield>
  </datafield>
  <datafield tag="700" ind1=" " ind2=" ">
    <subfield code="u">Amazon AI - Palo Alto, USA</subfield>
    <subfield code="a">Federico, Marcello</subfield>
  </datafield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">276461</subfield>
    <subfield code="z">md5:4741c3e104acfbf97d1380ac6ffd8075</subfield>
    <subfield code="u"></subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2019-11-02</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire</subfield>
    <subfield code="p">user-iwslt2019</subfield>
    <subfield code="o"></subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">Fondazione Bruno Kessler, Trento, Italy &amp; University of Trento, Italy</subfield>
    <subfield code="a">Lakew, Surafel Melaku</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Controlling the Output Length of Neural Machine Translation</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">user-iwslt2019</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u"></subfield>
    <subfield code="a">Creative Commons Attribution 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2"></subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;The recent advances introduced by neural machine translation (NMT) are rapidly expanding the application fields of machine translation, as well as reshaping the quality level to be targeted. In particular, if translations have to fit some given layout, quality should not only be measured in terms of adequacy and fluency, but also length. Exemplary cases are the translation of document files, subtitles, and scripts for dubbing, where the output length should ideally be as close as possible to the length of the input text. This paper addresses for the first time, to the best of our knowledge, the problem of controlling the output length in NMT. We investigate two methods for biasing the output length with a transformer architecture: i) conditioning the output to a given target-source length-ratio class and ii) enriching the transformer positional embedding with length information. Our experiments show that both methods can induce the network to generate shorter translations, as well as acquiring interpretable linguistic skills.&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="773" ind1=" " ind2=" ">
    <subfield code="n">doi</subfield>
    <subfield code="i">isVersionOf</subfield>
    <subfield code="a">10.5281/zenodo.3524956</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.5281/zenodo.3524957</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">publication</subfield>
    <subfield code="b">conferencepaper</subfield>
  </datafield>
</record>
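The first method described in the abstract (conditioning the output on a target-source length-ratio class) is commonly realized by tagging the training data with a class token derived from the observed length ratio. A minimal sketch of such preprocessing follows; the class names (`<short>`, `<normal>`, `<long>`) and the ratio thresholds are assumptions for illustration, not values taken from the paper.

```python
# Hedged sketch: prepend a length-ratio class token to each source sentence
# so the model learns to associate the token with the output length.
# Thresholds and token names below are illustrative assumptions.

def length_ratio_class(src_len, tgt_len, short_max=0.95, normal_max=1.05):
    """Bucket the target/source length ratio into a coarse class token."""
    ratio = tgt_len / src_len
    if ratio < short_max:
        return "<short>"
    if ratio <= normal_max:
        return "<normal>"
    return "<long>"

def tag_source(src_tokens, tgt_tokens):
    """Prepend the length-ratio class token of a training pair to its source."""
    token = length_ratio_class(len(src_tokens), len(tgt_tokens))
    return [token] + src_tokens

# Example: a target shorter than the source gets the <short> token.
print(tag_source(["the", "cat", "sat"], ["il", "gatto"]))
# → ['<short>', 'the', 'cat', 'sat']
```

At inference time, the desired class token would simply be prepended to the input, biasing the decoder toward the corresponding output length.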

