Conference paper · Open Access

Leveraging Transformer Self Attention Encoder for Crisis Event Detection in Short Texts

Pantelis Kyriakidis; Despoina Chatzakou; Theodora Tsikrika; Stefanos Vrochidis; Ioannis Kompatsiaris

Analyzing content generated on social media has proven to be a powerful tool for the early detection of crisis-related events. Such an analysis may allow for timely action, mitigating or even preventing altogether the effects of a crisis. However, the high noise levels in the short texts found on microblogging platforms, combined with the limited publicly available datasets, have rendered the task difficult. Here, we propose deep learning models based on a Transformer self-attention encoder, which is capable of detecting event-related parts of a text while also minimizing potential noise levels. Our models' efficacy is shown by experimenting with CrisisLexT26, achieving up to an 81.6% F1-score and 92.7% AUC.
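
As a rough illustration of the kind of model described in the abstract, the sketch below wires a Transformer self-attention encoder into a short-text classifier in Python/PyTorch. It is not the authors' architecture or configuration: the vocabulary size, dimensions, layer counts, and two-class output are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): a Transformer
# self-attention encoder over token embeddings, mean-pooled into a
# short-text crisis-event classifier. All hyperparameters are illustrative.
import torch
import torch.nn as nn

class CrisisEventClassifier(nn.Module):
    def __init__(self, vocab_size=30000, d_model=128, nhead=4,
                 num_layers=2, num_classes=2, max_len=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        self.pos = nn.Embedding(max_len, d_model)            # learned positions
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           dim_feedforward=256,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.classifier = nn.Linear(d_model, num_classes)

    def forward(self, token_ids):                            # (batch, seq_len)
        positions = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.embed(token_ids) + self.pos(positions)
        pad_mask = token_ids.eq(0)                            # ignore padding
        h = self.encoder(x, src_key_padding_mask=pad_mask)    # self-attention
        h = h.masked_fill(pad_mask.unsqueeze(-1), 0.0).sum(1)
        h = h / (~pad_mask).sum(1, keepdim=True).clamp(min=1) # mean pooling
        return self.classifier(h)                             # class logits

# Example: score a batch of two (already tokenized) short texts.
model = CrisisEventClassifier()
logits = model(torch.randint(1, 30000, (2, 32)))
print(logits.shape)  # torch.Size([2, 2])
```

The mean pooling over non-padded positions is one simple way to aggregate the encoder's token representations into a single vector per text; the paper's exact aggregation and classification head may differ.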

This preprint has not undergone peer review (when applicable) or any post-submission improvements or corrections. The Version of Record of this contribution is published in the proceedings of the 44th European Conference on Information Retrieval (ECIR 2022) and is available online at https://doi.org/10.1007/978-3-030-99739-7_19
Files (372.4 kB)
Leveraging Transformer Self Attention Encoder.pdf — 372.4 kB (md5:bdcb24641493d1f4408f17285c5189bd)
