Published July 27, 2017 | Version v2
Dataset Open

TokTrack: A Complete Token Provenance and Change Tracking Dataset for the English Wikipedia

  • 1. GESIS - Leibniz Institute for the Social Sciences
  • 2. Karlsruhe Institute of Technology

Description

Fixes in version 1.1 (listed on Zenodo as "version 2")

  • In 20161101-revisions-part1-12-1728.csv, the missing first data line has been added.

  • In the Current_content and Deleted_content files, token values (the 'str' column) that contain regular quote characters ('"') have been fixed.

  • In the Current_content and Deleted_content files, incorrect revision IDs in the 'origin_rev_id', 'in' and 'out' columns have been corrected (a loading sketch for these files follows this list).
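
Since the quote and revision-ID fixes above concern the content files, here is a minimal loading sketch in Python. The column names ('str', 'origin_rev_id', 'in', 'out') are taken from the fix notes; the file name, delimiter and serialization of the list columns are assumptions, so please consult the ReadMe.txt for the authoritative format.

```python
# Minimal sketch (not an official loader) for reading a Current_content CSV.
# Assumptions: comma-separated values, token strings quoted with '"', and the
# 'in'/'out' columns serialized as list literals of revision IDs.
import ast
import pandas as pd

df = pd.read_csv(
    "current_content_part1.csv",   # hypothetical file name
    quotechar='"',                 # token values in 'str' may themselves contain quotes
)

# Parse the deletion/re-addition revision lists into real Python lists.
df["in"] = df["in"].apply(ast.literal_eval)
df["out"] = df["out"].apply(ast.literal_eval)

print(df[["str", "origin_rev_id", "in", "out"]].head())
```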

 ------

This dataset contains every instance of all tokens (≈ words) ever written in undeleted, non-redirect English Wikipedia articles until October 2016, in total 13,545,349,787 instances. Each token is annotated with (i) the article revision it was originally created in, and (ii) lists with all the revisions in which the token was ever deleted and (potentially) re-added and re-deleted from its article, enabling a complete and straightforward tracking of its history.
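
To illustrate how these annotations make history tracking straightforward, the sketch below reconstructs the revision intervals during which a single token was present in its article. The field names mirror the 'origin_rev_id', 'out' and 'in' columns of the content files, and the assumption that the two lists alternate chronologically (deletion, re-addition, deletion, ...) should be checked against the toy example linked below.

```python
# Sketch: reconstruct a token's presence history from its origin revision and
# its 'out'/'in' lists (revisions in which it was deleted and re-added).
# Assumption: the lists alternate chronologically, starting with a deletion.
def presence_intervals(origin_rev_id, out_revs, in_revs):
    """Return (start_rev, end_rev_or_None) intervals during which the token was live."""
    intervals = []
    start = origin_rev_id
    for i, out_rev in enumerate(out_revs):
        intervals.append((start, out_rev))   # live from 'start' until deleted in out_rev
        if i < len(in_revs):
            start = in_revs[i]               # re-added in this revision
        else:
            return intervals                 # deleted and never restored
    intervals.append((start, None))          # still present in the latest revision
    return intervals

# Example with made-up revision IDs:
print(presence_intervals(1001, out_revs=[1005, 1020], in_revs=[1010, 1027]))
# [(1001, 1005), (1010, 1020), (1027, None)]
```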

This data would be exceedingly hard for the average user to create, as (i) it is very expensive to compute and (ii) accurately tracking the history of each token in revisioned documents is a non-trivial task.
By adapting a state-of-the-art algorithm, we have produced a dataset that allows a range of analyses and metrics, both those already popular in research and ones going beyond them, to be generated at the scale of the complete English Wikipedia, while ensuring quality and sparing researchers the expensive text-comparison computation that has so far hindered scalable use.

This dataset, its creation process, and its use cases are described in a dedicated dataset paper of the same name, published at the ICWSM 2017 conference. In that paper, we show how this data enables, at the token level, the computation of provenance, the measurement of content survival over time, very detailed conflict metrics, and the analysis of fine-grained editor interactions such as partial reverts and re-additions.
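
As a hedged illustration of one such analysis, the sketch below computes a simple provenance breakdown: the share of an article's current tokens that originate from each revision. It is a rough approximation for illustration only, not the exact metric definition used in the paper.

```python
# Sketch: provenance of an article's current content, i.e. which revision each
# surviving token was originally created in (via the 'origin_rev_id' field).
from collections import Counter

def provenance_share(current_tokens):
    """current_tokens: iterable of dicts, each with an 'origin_rev_id' entry."""
    counts = Counter(tok["origin_rev_id"] for tok in current_tokens)
    total = sum(counts.values())
    return {rev_id: n / total for rev_id, n in counts.items()}

# Example with made-up tokens: two thirds of the surviving text stem from revision 1001.
tokens = [
    {"str": "alpha", "origin_rev_id": 1001},
    {"str": "beta", "origin_rev_id": 1001},
    {"str": "gamma", "origin_rev_id": 1042},
]
print(provenance_share(tokens))   # {1001: 0.666..., 1042: 0.333...}
```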

Tokenization used: https://gist.github.com/faflo/3f5f30b1224c38b1836d63fa05d1ac94

Toy example for how the token metadata is generated: 
https://gist.github.com/faflo/8bd212e81e594676f8d002b175b79de8

Be sure to read the ReadMe.txt or, for even more detail, the supporting paper referenced under "related identifiers".

Notes

Please cite 10.5281/zenodo.789289 for all versions of this dataset, which will always resolve to the latest.

Files

dataset_readme.txt

Files (68.9 GB)

MD5 checksum | Size
md5:8ca900ad3932434e24039901b55c58d0 | 1.6 GB
md5:5bd0f0e8ff8334aac02a8297c8baf61c | 2.1 GB
md5:4ca06f0ccf24ff75046f8ef40b260aaa | 2.2 GB
md5:9f896a2447f95aa0cda91f00bd44669a | 2.3 GB
md5:ee988500d7731b57c1f65d7ff2fd3ffd | 2.4 GB
md5:8c67e68ac1c252c02f4366cbd44a1fb6 | 2.5 GB
md5:6f082f6e57d03acc2129a22981ee6cd6 | 2.7 GB
md5:f6445d0b099dac02d9f1926e778711a1 | 2.7 GB
md5:d94e8c44658d7f9a1878355a48f3adad | 2.9 GB
md5:83d3cfbf6b0d3a21a7431667444de2e5 | 3.0 GB
md5:7e06d1a4a2e4467ab056e6ed719b955d | 2.0 GB
md5:579468afdf3848406f090c0b56809bcd | 3.1 GB
md5:8128ec72ed59a926a11cb4bcc8515cb9 | 3.0 GB
md5:d2720327ed6afb13c996072bda08fe64 | 3.8 GB
md5:9873fbdbf5cb41c6ce77b489cd52464c | 3.3 GB
md5:af36d2cc05c4ade7df1e3fc5b91dbe28 | 3.1 GB
md5:f2eb903001ff4737698a434a0141521a | 2.9 GB
md5:dfde922adedd029793ae4112737a48cd | 2.7 GB
md5:3ece1cfa4326c624d043aebf280fe7f5 | 2.5 GB
md5:32984e79e729a63b53f5558b7d7b8ff2 | 2.2 GB
md5:d4d2fcc456cc275eb136a491f2ede1b9 | 2.1 GB
md5:4f2a91ef02f2201b5ca08bb677101272 | 1.9 GB
md5:36f4189a308b53d20e355ecf759f1768 | 1.7 GB
md5:1b7d2f6e670b4f896fbd12cc85cbc120 | 3.4 GB
md5:e00115390203eef125b045c93e41dfe3 | 1.4 GB
md5:c80884b1a83af575d66c033a4e0dcec3 | 882.1 MB
md5:222d935035ba5aec699c0210802668d9 | 4.9 kB
md5:7ba169bad9fd5c6778cf16f11707976c | 4.6 GB

Additional details

Related works

References

  • Flöck, Fabian, and Maribel Acosta. "WikiWho: Precise and Efficient Attribution of Authorship of Revisioned Content." Proceedings of the 23rd International Conference on World Wide Web. ACM, 2014.
  • Flöck, Fabian, Kenan Erdogan, and Maribel Acosta. "TokTrack: A Complete Token Provenance and Change Tracking Dataset for the English Wikipedia." Proceedings of ICWSM 2017 (to appear). Preprint: https://arxiv.org/abs/1703.08244