Published November 6, 2019 | Version v1
Dataset | Open Access

PAN19 Authorship Analysis: Cross-Domain Authorship Attribution

  • 1. University of Antwerp
  • 2. University of the Aegean
  • 3. Leipzig University
  • 4. Bauhaus-Universität Weimar

Description

Authorship attribution is an important problem in information retrieval and computational linguistics, but also in applied areas such as law and journalism, where knowing the author of a document (such as a ransom note) may enable law enforcement, for example, to save lives. The most common framework for testing candidate algorithms is the closed-set attribution task: given a sample of reference documents from a restricted and finite set of candidate authors, determine the most likely author of a previously unseen document of unknown authorship. This task can be quite challenging under cross-domain conditions, when the documents of known and unknown authorship come from different domains (e.g., thematic area, genre). In addition, it is often more realistic to assume that the true author of a disputed document is not necessarily included in the list of candidates.

Fanfiction refers to fiction produced by admirers ('fans') of a certain author (e.g. J.K. Rowling), novel ('Pride and Prejudice'), TV series (Sherlock Holmes), etc. The fans borrow heavily from the original work's theme, atmosphere, style, characters, story world, etc. to produce new fictional literature, the so-called fanfics. For this reason fanfiction is also known as transformative literature, and in recent years it has generated a number of controversies related to the intellectual property rights of the original authors (cf. plagiarism). Fanfiction, however, is typically produced by fans without any explicit commercial goals. Fanfics are typically published online, on informal community platforms dedicated to making such literature accessible to a wider audience (e.g. fanfiction.net). The original work of art or genre is typically referred to as a fandom.

This edition of PAN focuses on cross-domain attribution in fanfiction, a task that can be more accurately described as cross-fandom attribution: all documents of unknown authorship are fanfics of the same fandom (the target fandom), while the documents of known authorship by the candidate authors are fanfics of several other fandoms. In contrast to the PAN-2018 edition of this task, we focus on open-set attribution conditions, i.e. the true author of a text in the target fandom is not necessarily included in the list of candidate authors.

Each problem consists of a set of known fanfics by each candidate author and a set of unknown fanfics, located in separate folders. The file problem-info.json, found in the main folder of each problem, gives the name of the folder of unknown documents and the list of candidate author folders.
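As a rough illustration, a problem-info.json file can be parsed with the standard json module. The field names used below ("unknown-folder", "candidate-authors", "author-name") are assumptions for the sake of the sketch, not guaranteed by the dataset itself:

```python
import json

# Hypothetical problem-info.json content; the field names are an
# assumption for illustration, not taken verbatim from the dataset.
problem_info = json.loads("""
{
  "unknown-folder": "unknown",
  "candidate-authors": [
    {"author-name": "candidate00001"},
    {"author-name": "candidate00002"}
  ]
}
""")

# Folder holding the unknown fanfics, and the candidate author folders.
unknown_folder = problem_info["unknown-folder"]
candidates = [a["author-name"] for a in problem_info["candidate-authors"]]
print(unknown_folder)  # -> unknown
print(candidates)      # -> ['candidate00001', 'candidate00002']
```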

The fanfics of known authorship belong to several fandoms (excluding the target fandom). The file fandom-info.json, also found in the main folder of each problem, lists the fandom of every fanfic of known authorship.

The true author of each unknown document is given in the file ground-truth.json, also found in the main folder of each problem. Note that all unknown documents that were not written by any of the candidate authors belong to the &lt;UNK&gt; class.
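A minimal sketch of how the open-set labels might be handled, again assuming hypothetical field names ("ground_truth", "unknown-text", "true-author") that are not confirmed by this description:

```python
import json

# Hypothetical ground-truth.json content; field names are assumed.
ground_truth = json.loads("""
{
  "ground_truth": [
    {"unknown-text": "unknown00001.txt", "true-author": "candidate00001"},
    {"unknown-text": "unknown00002.txt", "true-author": "<UNK>"}
  ]
}
""")

# Documents written by none of the candidate authors carry the <UNK> label.
unk_texts = [entry["unknown-text"]
             for entry in ground_truth["ground_truth"]
             if entry["true-author"] == "<UNK>"]
print(unk_texts)  # -> ['unknown00002.txt']
```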

Finally, the file collection-info.json describes the collection as a whole: for each problem it lists the main folder, the language (either "en", "fr", "it", or "sp"), and the encoding (always UTF-8) of the documents.
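For instance, iterating over the collection could look like the sketch below. The problem names and field names are illustrative assumptions based on the description above:

```python
import json

# Illustrative collection-info.json; the listed problems and field
# names are assumptions, not taken from the actual file.
collection_info = json.loads("""
[
  {"problem-name": "problem00001", "language": "en", "encoding": "UTF-8"},
  {"problem-name": "problem00006", "language": "fr", "encoding": "UTF-8"}
]
""")

# Map each problem folder to its language, e.g. to pick a tokenizer.
languages = {p["problem-name"]: p["language"] for p in collection_info}
print(languages)  # -> {'problem00001': 'en', 'problem00006': 'fr'}
```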

Files

pan19-cross-domain-authorship-attribution-training-dataset-2019-01-23.zip

References

  • Mike Kestemont, Efstathios Stamatatos, Enrique Manjavacas, Walter Daelemans, Martin Potthast, and Benno Stein. Overview of the Cross-domain Authorship Attribution Task at PAN 2019. In Linda Cappellato, Nicola Ferro, David E. Losada, and Henning Müller, editors, CLEF 2019 Labs and Workshops, Notebook Papers, September 2019. CEUR-WS.org.