Published September 8, 2015 | Version v1
Dataset Open

PAN15 Author Identification: Verification


We provide you with a training corpus that comprises a set of author verification problems in several languages/genres. Each problem consists of some (up to five) known documents by a single person and exactly one questioned document. All documents within a single problem instance will be in the same language. However, their genre and/or topic may differ significantly. The document lengths vary from a few hundred to a few thousand words.

The documents of each problem are located in a separate folder, the name of which (problem ID) encodes the language of the documents. The following list shows the available sub-corpora, including their language, type (cross-genre or cross-topic), code, and examples of problem IDs:

Language   Type          Code   Problem IDs
Dutch      Cross-genre   DU     DU001, DU002, DU003, etc.
English    Cross-topic   EN     EN001, EN002, EN003, etc.
Greek      Cross-topic   GR     GR001, GR002, GR003, etc.
Spanish    Cross-genre   SP     SP001, SP002, SP003, etc.
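Given this layout, a minimal sketch for enumerating the training problems might look as follows. The exact document filenames inside each problem folder (e.g. known01.txt, unknown.txt) are assumptions here and should be adjusted to the actual corpus contents; only the two-letter language codes in the problem IDs are taken from the description above.

```python
from pathlib import Path

def list_problems(corpus_dir):
    """Enumerate problem folders, inferring each problem's language from its ID."""
    codes = {"DU": "Dutch", "EN": "English", "GR": "Greek", "SP": "Spanish"}
    problems = []
    for folder in sorted(Path(corpus_dir).iterdir()):
        if folder.is_dir() and folder.name[:2] in codes:
            problems.append({
                "id": folder.name,
                "language": codes[folder.name[:2]],
                # Filenames below are an assumption, not specified by the corpus
                # description; adapt the glob pattern to the real layout.
                "known": sorted(folder.glob("known*.txt")),
                "unknown": folder / "unknown.txt",
            })
    return problems
```

Each returned entry pairs the (up to five) known documents with the single questioned document of one verification problem.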

The ground truth data of the training corpus, found in the file truth.txt, includes one line per problem, giving the problem ID and the correct binary answer (Y means the known and the questioned documents are by the same author; N means they are not). For example:

EN001 N
EN002 Y
EN003 N
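A ground-truth file in this format can be parsed with a short sketch like the one below, which maps each problem ID to a boolean (True for Y, i.e. same author):

```python
def load_truth(path):
    """Parse a truth file with one 'problemID answer' pair (Y/N) per line."""
    truth = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2:  # skip blank or malformed lines
                pid, answer = parts
                truth[pid] = (answer == "Y")
    return truth
```

For the example above, load_truth would return {"EN001": False, "EN002": True, "EN003": False}.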


Files (6.1 MB)

Additional details


  • Efstathios Stamatatos, Walter Daelemans, Ben Verhoeven, Patrick Juola, Aurelio López-López, Martin Potthast, and Benno Stein. Overview of the Author Identification Task at PAN 2015. In Linda Cappellato, Nicola Ferro, Gareth Jones, and Eric San Juan, editors, CLEF 2015 Evaluation Labs and Workshop – Working Notes Papers, 8-11 September, Toulouse, France, September 2015. ISSN 1613-0073.
  • Efstathios Stamatatos, Martin Potthast, Francisco Rangel, Paolo Rosso, and Benno Stein. Overview of the PAN/CLEF 2015 Evaluation Lab. In Josiane Mothe et al., editors, Experimental IR Meets Multilinguality, Multimodality, and Interaction. 6th International Conference of the CLEF Initiative (CLEF 2015), pages 518-538, Berlin Heidelberg New York, September 2015. Springer. ISBN 978-3-319-24026-8.