Published April 19, 2021 | Version v1
Conference paper | Open Access

On the Evolution of Syntactic Information Encoded by BERT's Contextualized Representations

  • 1. TALN Research Group, Pompeu Fabra University, Barcelona, Spain
  • 2. Amazon AI
  • 3. Pompeu Fabra University and Catalan Institute for Research and Advanced Studies (ICREA), Barcelona, Spain

Description

The adaptation of pretrained language models to solve supervised tasks has become a baseline in NLP, and many recent works have focused on studying how linguistic information is encoded in the pretrained sentence representations. Among other information, it has been shown that entire syntax trees are implicitly embedded in the geometry of such models. As these models are often fine-tuned, it becomes increasingly important to understand how the encoded knowledge evolves during fine-tuning. In this paper, we analyze the evolution of the embedded syntax trees along the fine-tuning process of BERT for six different tasks, covering all levels of the linguistic structure. Experimental results show that the encoded syntactic information is forgotten (PoS tagging), reinforced (dependency and constituency parsing) or preserved (semantics-related tasks) in different ways during fine-tuning, depending on the task.
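
The paper examines how the syntax trees recoverable from BERT's hidden states change as the model is fine-tuned. As a rough illustration of the kind of representations involved, the sketch below (not the authors' code) extracts layer-wise contextualized vectors from a Hugging Face BERT checkpoint and computes pairwise squared distances between token vectors; a structural probe in the style of Hewitt and Manning (2019) would additionally learn a linear transformation and fit these distances to gold parse-tree distances. The checkpoint name and layer index are placeholders, not values from the paper.

    import torch
    from transformers import BertModel, BertTokenizerFast

    # Placeholder checkpoint; in the paper's setting this would be a BERT model
    # saved at some step of fine-tuning on one of the six studied tasks.
    checkpoint = "bert-base-uncased"

    tokenizer = BertTokenizerFast.from_pretrained(checkpoint)
    model = BertModel.from_pretrained(checkpoint, output_hidden_states=True)
    model.eval()

    sentence = "The chef who ran to the store was out of food ."
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Tuple of (num_layers + 1) tensors, each of shape [1, seq_len, hidden_dim].
    hidden_states = outputs.hidden_states

    # Pairwise squared L2 distances between token vectors at one layer.
    # A learned probe would transform the vectors before computing distances;
    # the raw distances are shown here only for illustration.
    layer = 7
    vecs = hidden_states[layer][0]                 # [seq_len, hidden_dim]
    diffs = vecs.unsqueeze(0) - vecs.unsqueeze(1)  # [seq_len, seq_len, hidden_dim]
    squared_distances = (diffs ** 2).sum(-1)       # [seq_len, seq_len]
    print(squared_distances.shape)
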

Notes

https://aclanthology.org/2021.eacl-main.191.pdf

Files

2021.eacl-main.191.pdf (1.8 MB)
md5:5d17079447a28b1f6d733871db7705bc