
Conference paper Open Access

Leveraging Transformer Self Attention Encoder for Crisis Event Detection in Short Texts

Pantelis Kyriakidis; Despoina Chatzakou; Theodora Tsikrika; Stefanos Vrochidis; Ioannis Kompatsiaris

Citation Style Language JSON Export

  "publisher": "Zenodo", 
  "DOI": "10.5281/zenodo.6036739", 
  "language": "eng", 
  "title": "Leveraging Transformer Self Attention Encoder for Crisis Event Detection in Short Texts", 
  "issued": {
    "date-parts": [
  "abstract": "<p>Analyzing content generated on social media has proven to be a powerful tool for early detection of crisis-related events. Such an analysis may allow for timely action, mitigating or even preventing altogether the effects of a crisis. However, the high noise levels in short texts present in microblogging platforms, combined with the limited publicly available datasets have rendered the task difficult. Here, we propose deep learning models based on a transformer self-attention encoder, which is capable of detecting event-related parts in a text, while also minimizing potential noise levels. Our models efficacy is shown by experimenting with CrisisLexT26, achieving up to 81.6% f1-score and 92.7% AUC.</p>", 
  "author": [
      "family": "Pantelis Kyriakidis"
      "family": "Despoina Chatzakou"
      "family": "Theodora Tsikrika"
      "family": "Stefanos Vrochidis"
      "family": "Ioannis Kompatsiaris"
  "id": "6036739", 
  "note": "This preprint has not undergone peer review (when applicable) or any post-submission improvements or corrections. The Version of Record of this contribution is published in the 44th European Conference on Information Retrieval, and is available online at https://doi.org/10.1007/978-3-030-99739-7_19", 
  "event-place": "Stavanger, Norway", 
  "type": "paper-conference", 
  "event": "44th European Conference on Information Retrieval (ECIR'22)"
                   All versions   This version
Views              177            177
Downloads          172            172
Data volume        64.1 MB        64.1 MB
Unique views       153            153
Unique downloads   155            155

