Conference paper Open Access

Leveraging Transformer Self Attention Encoder for Crisis Event Detection in Short Texts

Pantelis Kyriakidis; Despoina Chatzakou; Theodora Tsikrika; Stefanos Vrochidis; Ioannis Kompatsiaris

Analyzing content generated on social media has proven to be a powerful tool for the early detection of crisis-related events. Such analysis may allow for timely action, mitigating or even preventing altogether the effects of a crisis. However, the high noise levels in the short texts found on microblogging platforms, combined with the limited availability of public datasets, have rendered the task difficult. Here, we propose deep learning models based on a transformer self-attention encoder, which is capable of detecting event-related parts of a text while also minimizing potential noise levels. Our models' efficacy is shown by experimenting with CrisisLexT26, achieving up to 81.6% F1-score and 92.7% AUC.
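
The abstract describes models built around a transformer self-attention encoder applied to short, noisy texts. As a rough illustration only, and not the authors' implementation, the PyTorch sketch below shows one way such an encoder can be wired up for short-text event classification. The hyperparameters, the six-class output, the mean pooling over non-padded tokens, and the class name SelfAttentionEventClassifier are all assumptions made for this example.

# Minimal sketch (assumed architecture, not the paper's code): a transformer
# self-attention encoder that classifies a tokenized short text, e.g. a tweet,
# into one of several crisis-related event categories.
import torch
import torch.nn as nn

class SelfAttentionEventClassifier(nn.Module):
    def __init__(self, vocab_size, num_classes, d_model=128, nhead=4,
                 num_layers=2, max_len=64, dropout=0.1):
        super().__init__()
        self.token_emb = nn.Embedding(vocab_size, d_model, padding_idx=0)
        self.pos_emb = nn.Embedding(max_len, d_model)   # learned positional embeddings
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, dim_feedforward=4 * d_model,
            dropout=dropout, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids):                       # token_ids: (batch, seq_len)
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.token_emb(token_ids) + self.pos_emb(positions)
        pad_mask = token_ids.eq(0)                      # True where the token is padding
        h = self.encoder(x, src_key_padding_mask=pad_mask)
        h = h.masked_fill(pad_mask.unsqueeze(-1), 0.0)  # zero out padded positions
        denom = (~pad_mask).sum(1, keepdim=True).clamp(min=1)
        pooled = h.sum(1) / denom                       # mean over real tokens
        return self.classifier(pooled)                  # (batch, num_classes) logits

# Toy usage with random token ids standing in for tokenized tweets.
model = SelfAttentionEventClassifier(vocab_size=10000, num_classes=6)
logits = model(torch.randint(1, 10000, (8, 32)))        # 8 texts of 32 tokens each
print(logits.shape)                                     # torch.Size([8, 6])

Mean pooling over the encoder output is just one aggregation choice; the paper's models may combine the per-token representations differently.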

This preprint has not undergone peer review (when applicable) or any post-submission improvements or corrections. The Version of Record of this contribution will be published in Proceedings of the 44th European Conference on Information Retrieval. (link & DOI will be added when they become available)
Files (372.4 kB)
Leveraging Transformer Self Attention Encoder.pdf (372.4 kB, md5:bdcb24641493d1f4408f17285c5189bd)
Statistics (all versions / this version)
Views: 58 / 58
Downloads: 55 / 55
Data volume: 20.5 MB / 20.5 MB
Unique views: 53 / 53
Unique downloads: 44 / 44
