Journal article

Word-Class Embeddings for Multiclass Text Classification

Moreo, Alejandro; Esuli, Andrea; Sebastiani, Fabrizio

Pre-trained word embeddings encode general word semantics and lexical regularities of natural language, and have proven useful across many NLP tasks, including word sense disambiguation, machine translation, and sentiment analysis, to name a few. In supervised tasks such as multiclass text classification (the focus of this article) it seems appealing to enhance word representations with ad-hoc embeddings that encode task-specific information. We propose (supervised) word-class embeddings (WCEs), and show that, when concatenated to (unsupervised) pre-trained word embeddings, they substantially facilitate the training of deep-learning models in multiclass classification by topic. We show empirical evidence that WCEs yield a consistent improvement in multiclass classification accuracy, using six popular neural architectures and six widely used and publicly available datasets for multiclass text classification. One further advantage of this method is that it is conceptually simple and straightforward to implement. Our code that implements WCEs is publicly available at https://github.com/AlexMoreo/word-class-embeddings.
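The sketch below illustrates the idea summarized in the abstract: each word receives a supervised vector with one dimension per class, derived from word-class co-occurrence in the labeled training set, which is then concatenated to its unsupervised pre-trained embedding. The specific correlation measure and normalization used here (L1-normalized co-occurrence counts) are illustrative assumptions, not the authors' exact formulation; the reference implementation is in the linked repository.

```python
import numpy as np

def word_class_embeddings(doc_term: np.ndarray, labels: np.ndarray) -> np.ndarray:
    """Build supervised word-class embeddings (a simplified, assumed variant).

    doc_term: (n_docs, vocab_size) word-count (or binary) matrix of the training set.
    labels:   (n_docs, n_classes) one-hot or multi-hot class indicators.
    Returns a (vocab_size, n_classes) matrix: one supervised vector per word.
    """
    # Co-occurrence of each word with each class across training documents.
    cooc = doc_term.T @ labels                       # (vocab_size, n_classes)
    # Normalize each word's class profile so rows sum to 1 (illustrative choice).
    row_sums = cooc.sum(axis=1, keepdims=True)
    return cooc / np.maximum(row_sums, 1e-12)

def concat_embeddings(pretrained: np.ndarray, wce: np.ndarray) -> np.ndarray:
    """Concatenate unsupervised and supervised embeddings per word:
    (vocab_size, d) + (vocab_size, n_classes) -> (vocab_size, d + n_classes)."""
    return np.hstack([pretrained, wce])
```

The concatenated matrix can then initialize the embedding layer of any of the neural classifiers mentioned in the abstract, with the pre-trained part carrying general semantics and the WCE part carrying task-specific class correlations.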

Embargoed Access

Files are currently under embargo but will be publicly accessible after January 31, 2022.
