Conference paper Open Access

Exploring Transformers for Ranking Portuguese Semantic Relations

Hugo Gonçalo Oliveira

We explored transformer-based language models for ranking instances of Portuguese lexico-semantic relations. Weights were based on the likelihood of natural language sequences that conveyed the relation instances, with the expectation that they would be useful for filtering out noisier instances. However, after analysing the weights, no strong conclusions could be drawn. They are not correlated with redundancy, but are lower for instances with longer and more specific arguments, which may nevertheless be a consequence of their sensitivity to the frequency of such arguments. Nor did they prove useful when computing word similarity with network embeddings. Despite the negative results, we see the reported experiments and insights as another contribution towards better understanding transformer language models like BERT and GPT, and we make the weighted instances publicly available for further research.
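The core idea can be sketched as follows. This is a minimal, self-contained illustration, not the paper's actual implementation: the paper scores sequences with transformer models such as BERT or GPT, whereas here a toy unigram log-probability table stands in for the language model, and the verbalization template is only one hypothetical way of expressing a hypernymy instance in Portuguese.

```python
import math

# Hypothetical stand-in for a transformer LM's token log-probabilities.
# In the paper, models like BERT or GPT provide these scores; a toy
# unigram table keeps this sketch runnable without any model download.
TOY_LOGPROBS = {
    "um": -1.0, "cão": -4.0, "é": -1.2, "animal": -3.5,
    "tipo": -2.0, "de": -1.1, "chihuahua": -9.0,
}
DEFAULT_LOGPROB = -10.0  # unseen tokens are treated as rare

def sequence_score(sentence: str) -> float:
    """Average token log-likelihood of a verbalized relation instance."""
    tokens = sentence.lower().split()
    total = sum(TOY_LOGPROBS.get(t, DEFAULT_LOGPROB) for t in tokens)
    return total / len(tokens)

def verbalize(hyponym: str, hypernym: str) -> str:
    # One possible template for expressing hypernymy in Portuguese.
    return f"um {hyponym} é um tipo de {hypernym}"

def rank_instances(pairs):
    """Rank (hyponym, hypernym) pairs by the likelihood of their verbalization."""
    return sorted(pairs, key=lambda p: sequence_score(verbalize(*p)), reverse=True)

pairs = [("chihuahua", "animal"), ("cão", "animal")]
print(rank_instances(pairs))
```

Note how the instance with the rarer, more specific argument ("chihuahua") receives a lower score than the one with the frequent argument ("cão"), mirroring the frequency-sensitivity effect discussed in the abstract.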

