Published November 2, 2019 | Version v1
Conference paper · Open Access

Analysis of Positional Encodings for Neural Machine Translation

  • Human Language Technology and Pattern Recognition Group, RWTH Aachen University, Germany

Description

In this work we analyze and compare the behavior of the Transformer architecture when using different positional encoding methods. While absolute and relative positional encoding perform equally well overall, we show that relative positional encoding is vastly superior (by 4.4% to 11.9% BLEU) when translating sentences longer than any sentence observed during training. We further propose and analyze variations of relative positional encoding and observe that the number of trainable parameters can be reduced without any loss in performance, either by using fixed encoding vectors or by removing some of the positional encoding vectors.
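For context, the two encoding families compared in the abstract can be sketched as follows. This is a minimal illustration based on the standard formulations from the Transformer literature (sinusoidal absolute encodings à la Vaswani et al., and relative distances clipped to a maximum range à la Shaw et al.), not the paper's own implementation; all function and variable names are illustrative.

```python
import numpy as np

def absolute_sinusoidal_encoding(seq_len, d_model):
    """Fixed absolute encodings (d_model assumed even):
    PE[pos, 2i]   = sin(pos / 10000^(2i/d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i/d_model))"""
    pos = np.arange(seq_len)[:, None]        # shape (seq_len, 1)
    i = np.arange(0, d_model, 2)[None, :]    # shape (1, d_model/2)
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)             # even dimensions: sine
    pe[:, 1::2] = np.cos(angles)             # odd dimensions: cosine
    return pe

def clipped_relative_positions(seq_len, max_distance):
    """Relative distances j - i, clipped to [-max_distance, max_distance].
    Each distinct clipped value indexes one encoding vector, which may be
    trainable or fixed (one of the variations the paper analyzes)."""
    pos = np.arange(seq_len)
    rel = pos[None, :] - pos[:, None]        # rel[i, j] = j - i
    clipped = np.clip(rel, -max_distance, max_distance)
    return clipped + max_distance            # shift to non-negative indices

# Example usage with illustrative sizes:
pe = absolute_sinusoidal_encoding(seq_len=8, d_model=16)    # (8, 16)
idx = clipped_relative_positions(seq_len=8, max_distance=4)  # (8, 8)
```

One plausible intuition for the long-sentence result: with clipping, relative distances beyond max_distance collapse onto encoding vectors already seen during training, whereas absolute positions beyond the longest training sentence are entirely unseen at test time.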

Files

IWSLT2019_paper_21.pdf (485.7 kB, md5:a2422d4b74a9634b2567ac4e9f630065)