Conference paper Open Access

Controlling the Output Length of Neural Machine Translation

Lakew, Surafel Melaku; Di Gangi, Mattia; Federico, Marcello

DataCite XML Export

<?xml version='1.0' encoding='utf-8'?>
<resource xmlns:xsi="" xmlns="" xsi:schemaLocation="">
  <identifier identifierType="DOI">10.5281/zenodo.3524957</identifier>
  <creators>
    <creator>
      <creatorName>Lakew, Surafel Melaku</creatorName>
      <givenName>Surafel Melaku</givenName>
      <familyName>Lakew</familyName>
      <affiliation>Fondazione Bruno Kessler, Trento, Italy &amp; University of Trento, Italy</affiliation>
    </creator>
    <creator>
      <creatorName>Di Gangi, Mattia</creatorName>
      <givenName>Mattia</givenName>
      <familyName>Di Gangi</familyName>
      <affiliation>Fondazione Bruno Kessler, Trento, Italy &amp; University of Trento, Italy</affiliation>
    </creator>
    <creator>
      <creatorName>Federico, Marcello</creatorName>
      <givenName>Marcello</givenName>
      <familyName>Federico</familyName>
      <affiliation>Amazon AI - Palo Alto, USA</affiliation>
    </creator>
  </creators>
  <titles>
    <title>Controlling the Output Length of Neural Machine Translation</title>
  </titles>
  <dates>
    <date dateType="Issued">2019-11-02</date>
  </dates>
  <resourceType resourceTypeGeneral="Text">Conference paper</resourceType>
  <alternateIdentifiers>
    <alternateIdentifier alternateIdentifierType="url"></alternateIdentifier>
  </alternateIdentifiers>
  <relatedIdentifiers>
    <relatedIdentifier relatedIdentifierType="DOI" relationType="IsVersionOf">10.5281/zenodo.3524956</relatedIdentifier>
    <relatedIdentifier relatedIdentifierType="URL" relationType="IsPartOf"></relatedIdentifier>
  </relatedIdentifiers>
  <rightsList>
    <rights rightsURI="">Creative Commons Attribution 4.0 International</rights>
    <rights rightsURI="info:eu-repo/semantics/openAccess">Open Access</rights>
  </rightsList>
  <descriptions>
    <description descriptionType="Abstract">&lt;p&gt;The recent advances introduced by neural machine translation (NMT) are rapidly expanding the application fields of machine translation, as well as reshaping the quality level to be targeted. In particular, if translations have to fit some given layout, quality should not only be measured in terms of adequacy and fluency, but also length. Exemplary cases are the translation of document files, subtitles, and scripts for dubbing, where the output length should ideally be as close as possible to the length of the input text. This paper addresses for the first time, to the best of our knowledge, the problem of controlling the output length in NMT. We investigate two methods for biasing the output length with a transformer architecture: i) conditioning the output to a given target-source length-ratio class and ii) enriching the transformer positional embedding with length information. Our experiments show that both methods can induce the network to generate shorter translations, as well as acquiring interpretable linguistic skills.&lt;/p&gt;</description>
  </descriptions>
</resource>

