CITREC - Open Evaluation Framework for Citation-based and Text-based Similarity Measures

This document gives an overview of the framework's components and capabilities.
For the latest information on CITREC, visit https://purl.org/citrec

CITREC Overview Paper (please use this paper to cite CITREC):

B. Gipp, N. Meuschke, and M. Lipinski,
“CITREC: An Evaluation Framework for Citation-Based Similarity Measures based on TREC Genomics and PubMed Central,”
in Proceedings of the iConference 2015, Newport Beach, California, 2015.
DOI

Summary

CITREC prepares the data of two formerly separate collections for citation-based analysis and provides the tools necessary for evaluating similarity measures. The first collection is the PubMed Central Open Access Subset (PMC OAS); the second is the collection used for the Genomics Tracks at the Text REtrieval Conferences (TREC) ’06 and ’07 (overview paper for the TREC Gen collection).

CITREC extends the PMC OAS and TREC Genomics collections by providing:

  1. citation and reference information that includes the position of in-text citations for documents in both collections;
  2. code and pre-computed scores for 35 citation-based and text-based similarity measures;
  3. two gold standards based on Medical Subject Headings (MeSH) descriptors and the relevance feedback gathered for the TREC Genomics collection;
  4. a web-based system (Literature Recommendation Evaluator – LRE) that enables evaluating how well similarity measures identify documents relevant to user-defined information needs;
  5. tools to statistically analyze and compare the scores that individual similarity measures yield.
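The citation-based measures in item 2 include classic approaches such as Co-Citation and Bibliographic Coupling. As a minimal illustration of the underlying idea (this is a generic Python sketch, not CITREC's actual Java implementation, and the document IDs and reference lists are invented):

```python
# Illustrative sketch of two classic citation-based similarity measures.
# The toy citation data below is invented for demonstration only.

# Map each document to the set of documents it cites.
references = {
    "A": {"X", "Y", "Z"},
    "B": {"Y", "Z", "W"},
    "C": {"A", "B"},
    "D": {"A", "B", "X"},
}

def bibliographic_coupling(doc1, doc2, refs):
    """Number of references the two documents share."""
    return len(refs[doc1] & refs[doc2])

def co_citation(doc1, doc2, refs):
    """Number of documents that cite both doc1 and doc2."""
    return sum(1 for cited in refs.values() if doc1 in cited and doc2 in cited)

print(bibliographic_coupling("A", "B", references))  # A and B share Y and Z -> 2
print(co_citation("A", "B", references))             # C and D cite both A and B -> 2
```

Both measures yield a raw count that is typically normalized before documents are ranked by similarity; CITREC's pre-computed score tables cover many such variants.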

Demo System

The demo database (user: citrec_demo / password: citrec) gives you a first impression of the data that CITREC offers and the kinds of analyses the framework supports.

This Excel spreadsheet exemplifies a possible evaluation using CITREC data. The spreadsheet compares the scores calculated using different similarity measures as a function of the maximum Co-Citation score (i).

Documentation

  1. Database Overview and Tutorial explaining the structure of the CITREC database and demonstrating the usage of the demo system.
  2. Overview of Similarity Tables listing the similarity measures included in the CITREC framework and explaining the naming conventions for the database tables that contain the similarity scores calculated using the individual measures.
  3. Parser Documentation explaining the procedures for data extraction and cleaning.
  4. LRE Documentation describing the web-based SciPlore Literature Recommendation Evaluator, which enables conducting surveys to gather relevance feedback and establish gold-standard datasets.

Data

PubMed Central Open Access Subset

TREC Genomics collection

Source Code

Analysis Code

The code includes:

  1. parsers for the PMC OAS and the TREC Genomics collection as well as tools to retrieve MeSH and article metadata from NCBI resources (package org.sciplore.citrec.dataimport)
  2. tools to statistically evaluate retrieval results using a top-k or a rank-based analysis (package org.sciplore.citrec.eval)
  3. implementations of similarity measures and code to calculate the MeSH-based gold standard (package org.sciplore.citrec.sim)
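As a rough illustration of the top-k style of analysis mentioned in item 2 (a generic precision@k sketch, not the API of org.sciplore.citrec.eval; the ranking and gold standard below are invented):

```python
# Illustrative top-k evaluation: precision@k for a ranked retrieval result.
# The ranking and gold standard are invented toy data.

def precision_at_k(ranked_ids, relevant_ids, k):
    """Fraction of the top-k retrieved documents that the gold standard deems relevant."""
    top_k = ranked_ids[:k]
    hits = sum(1 for doc_id in top_k if doc_id in relevant_ids)
    return hits / k

ranking = ["d3", "d1", "d7", "d2", "d9"]   # documents ordered by similarity score
gold = {"d1", "d2", "d4"}                  # documents judged relevant

print(precision_at_k(ranking, gold, 3))  # top 3 = d3, d1, d7; one hit (d1)
print(precision_at_k(ranking, gold, 5))  # top 5 contains d1 and d2; two hits
```

A rank-based analysis would instead compare the full orderings that two measures produce, e.g. with a rank correlation coefficient, rather than truncating at a cutoff k.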

LRE Code

The source code for the Literature Recommendation Evaluator (LRE) uses the Symfony (v. 2) PHP framework.

Contribute

CITREC is an open source project published under the GNU General Public License (GPL) version 2. We warmly invite you to contribute to the continuous development of the framework by sharing results and resources related to CITREC.

If you have performed an evaluation using CITREC, developed a similarity measure, a parser, or any other tool that you would like to share, we would be happy to acknowledge and share your work on this page. If you are interested in making your resources available through this page, please contact us at n@meuschke.org.

Document Collections and Metadata

Below, we link to the sources of full texts and metadata that we combined, processed and enhanced as part of the CITREC framework. Please observe the individual licenses of the publishers!

Related Projects

Acknowledgements

We thank everyone who contributed to the creation of the TREC Genomics test collection. Without their great work, the realization of the CITREC framework would not have been possible.

Contact

If you experience any problems or would like to contribute to this project, please send us an email:
n@meuschke.org