Conference paper · Open Access
Nguyen, Toan Q.; Salazar, Julian
**Transformers without Tears: Improving the Normalization of Self-Attention**

Published: November 2, 2019

We evaluate three simple, normalization-centric changes to improve Transformer training. First, we show that pre-norm residual connections (PRENORM) and smaller initializations enable warmup-free, validation-based training with large learning rates. Second, we propose ℓ2 normalization with a single scale parameter (SCALENORM) for faster training and better performance. Finally, we reaffirm the effectiveness of normalizing word embeddings to a fixed length (FIXNORM). On five low-resource translation pairs from TED Talks-based corpora, these changes always converge, giving an average +1.1 BLEU over state-of-the-art bilingual baselines and a new 32.8 BLEU on IWSLT '15 English-Vietnamese. We observe sharper performance curves, more consistent gradient norms, and a linear relationship between activation scaling and decoder depth. Surprisingly, in the high-resource setting (WMT '14 English-German), SCALENORM and FIXNORM remain competitive but PRENORM degrades performance.

DOI: 10.5281/zenodo.3525484 (all versions: 10.5281/zenodo.3525483)
URL: https://zenodo.org/record/3525484
Community: https://zenodo.org/communities/iwslt2019
Language: English
License: Creative Commons Attribution 4.0 (https://creativecommons.org/licenses/by/4.0/legalcode)
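The two normalization changes named in the abstract lend themselves to a short sketch. Below is a minimal PyTorch rendering, assuming standard Transformer activation shapes; the class names, the `eps` guard, and the `sqrt(d_model)` initialization of the scale `g` are illustrative assumptions, not taken from the authors' released code.

```python
import torch
import torch.nn as nn


class ScaleNorm(nn.Module):
    """SCALENORM: l2-normalize each activation vector, then rescale it by a
    single learned scalar g ("a single scale parameter" in the abstract)."""

    def __init__(self, scale: float, eps: float = 1e-5):
        super().__init__()
        self.g = nn.Parameter(torch.tensor(float(scale)))
        self.eps = eps  # guard against division by zero (an assumption)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize along the hidden dimension, then apply the learned scale.
        norm = x.norm(dim=-1, keepdim=True).clamp(min=self.eps)
        return self.g * x / norm


class PreNormResidual(nn.Module):
    """PRENORM ordering: x + sublayer(norm(x)), i.e. normalize before the
    sublayer rather than after the residual sum as in post-norm."""

    def __init__(self, d_model: int, sublayer: nn.Module):
        super().__init__()
        # Initializing g to sqrt(d_model) is an illustrative assumption.
        self.norm = ScaleNorm(d_model ** 0.5)
        self.sublayer = sublayer

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.sublayer(self.norm(x))
```

FIXNORM, on this reading, is the same ℓ2 normalization applied to the word-embedding table, but with a fixed rather than learned length.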
| | All versions | This version |
|---|---|---|
| Views | 1,037 | 1,037 |
| Downloads | 644 | 644 |
| Data volume | 222.8 MB | 222.8 MB |
| Unique views | 888 | 888 |
| Unique downloads | 576 | 576 |