Published August 2, 2021 | Version v1
Conference paper | Open Access

Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models

  • 1. TALN Research Group, Pompeu Fabra University, Barcelona, Spain
  • 2. Catalan Institute for Research and Advanced Studies and Pompeu Fabra University, Barcelona, Spain

Description

Multilingual Transformer-based language models, usually pretrained on more than 100 languages, have been shown to achieve outstanding results in a wide range of cross-lingual transfer tasks. However, it remains unknown whether the optimization for different languages conditions the capacity of the models to generalize over syntactic structures, and how languages with syntactic phenomena of different complexity are affected. In this work, we explore the syntactic generalization capabilities of the monolingual and multilingual versions of BERT and RoBERTa. More specifically, we evaluate the syntactic generalization potential of the models on English and Spanish tests, comparing the syntactic abilities of monolingual and multilingual models on the same language (English), and of multilingual models on two different languages (English and Spanish). For English, we use the available SyntaxGym test suite; for Spanish, we introduce SyntaxGymES, a novel ensemble of targeted syntactic tests in Spanish, designed to evaluate the syntactic generalization capabilities of language models through the SyntaxGym online platform.
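The evaluation paradigm referenced above (targeted syntactic tests) rests on a simple contrast: a model that has generalized a syntactic rule should assign a higher score to the grammatical member of a minimal pair than to the ungrammatical one. As a minimal illustration only, not the paper's exact SyntaxGym protocol, the Python sketch below scores a subject-verb agreement pair with a multilingual masked language model via pseudo-log-likelihood; the HuggingFace transformers library, the bert-base-multilingual-cased checkpoint, and the example sentences are assumptions of the sketch rather than details taken from the paper.

    import torch
    from transformers import AutoTokenizer, AutoModelForMaskedLM

    # Assumed checkpoint; the paper compares several mono- and multilingual models.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
    model = AutoModelForMaskedLM.from_pretrained("bert-base-multilingual-cased")
    model.eval()

    def pseudo_log_likelihood(sentence: str) -> float:
        """Sum the log-probability of each token when it is masked in turn."""
        input_ids = tokenizer(sentence, return_tensors="pt")["input_ids"][0]
        total = 0.0
        # Skip the [CLS] and [SEP] special tokens at the sentence edges.
        for i in range(1, input_ids.size(0) - 1):
            masked = input_ids.clone()
            masked[i] = tokenizer.mask_token_id
            with torch.no_grad():
                logits = model(masked.unsqueeze(0)).logits
            log_probs = torch.log_softmax(logits[0, i], dim=-1)
            total += log_probs[input_ids[i]].item()
        return total

    # Illustrative minimal pair (not an actual SyntaxGym item): the test
    # "passes" if the grammatical variant receives the higher score.
    good = "The keys to the cabinet are on the table."
    bad = "The keys to the cabinet is on the table."
    print(pseudo_log_likelihood(good) > pseudo_log_likelihood(bad))

SyntaxGym test suites generalize this idea: each item defines regions of interest and a success criterion over model surprisals in those regions, with results aggregated over many items per syntactic phenomenon.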

Notes

https://aclanthology.org/2021.findings-acl.333.pdf

Files

2021.findings-acl.333.pdf (339.7 kB)
md5:637a8da8004af63c954d6e4bd289be08

Additional details

Funding

CONNEXIONs – InterCONnected NEXt-Generation Immersive IoT Platform of Crime and Terrorism DetectiON, PredictiON, InvestigatiON, and PreventiON Services (Grant No. 786731), European Commission