Published August 23, 2017 | Version v1
Conference paper | Open Access

Keeping linked open data caches up-to-date by predicting the life-time of RDF triples

  • Nishioka, Chifumi (Kyoto University Library)
  • Scherp, Ansgar (Kiel University and ZBW - Leibniz Information Centre for Economics, Kiel, Germany)

Description

Many Linked Open Data (LOD) applications require fresh copies of RDF data in their local repositories. Since RDF documents constantly change and those changes are not automatically propagated to the LOD applications, the RDF documents must be revisited regularly to refresh the local copies and keep them up-to-date. For this purpose, crawling strategies determine which RDF documents should be fetched preferentially. Traditional crawling strategies rely only on how an RDF document has been modified in the past. In contrast, we predict at the triple level whether a change will occur in the future. We use the weekly snapshots of the DyLDO dataset as well as the monthly snapshots of the Wikidata dataset. First, we conduct an in-depth analysis of the life span of triples in RDF documents. Through this analysis, we identify which triples are stable and which are ephemeral. We introduce different features based on the triples and apply a simple but effective linear regression model. Second, we propose a novel crawling strategy based on the linear regression model. We evaluate it in two experimental setups, in which we vary the amount of available bandwidth and iteratively observe the quality of the local copies over time. The results demonstrate that the novel crawling strategy outperforms the state of the art in both setups.
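The record itself contains no code; as a rough, non-authoritative sketch of the idea described above, the following Python snippet fits a linear regression over toy per-triple features (a triple's age and its number of past changes, which are illustrative assumptions rather than the paper's actual feature set), aggregates the predicted change scores per document, and refreshes the highest-scoring documents under a fixed bandwidth budget.

    # Minimal sketch (not the authors' code): predict per-triple change with a
    # linear model, aggregate scores per document, and refresh greedily under
    # a bandwidth budget. Features and budget are illustrative assumptions.
    import numpy as np

    # Toy features per triple: [age in weeks, number of past changes]
    X = np.array([[1, 4], [10, 0], [3, 2], [8, 1]], dtype=float)
    # Target: 1.0 if the triple changed in the next snapshot, else 0.0
    y = np.array([1.0, 0.0, 1.0, 0.0])

    # Fit ordinary least squares with an intercept term
    A = np.hstack([X, np.ones((X.shape[0], 1))])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)

    def change_score(features):
        """Predicted likelihood that a triple changes before the next crawl."""
        return float(np.dot(np.append(features, 1.0), w))

    # Map triples to the documents they occur in (hypothetical assignment)
    doc_of_triple = ["d1", "d1", "d2", "d3"]
    doc_scores = {}
    for feats, doc in zip(X, doc_of_triple):
        doc_scores[doc] = doc_scores.get(doc, 0.0) + change_score(feats)

    # Greedy crawl order: fetch the documents most likely to have changed,
    # stopping when the per-iteration bandwidth budget is exhausted
    bandwidth = 2  # documents fetchable this round (an assumption)
    to_fetch = sorted(doc_scores, key=doc_scores.get, reverse=True)[:bandwidth]
    print(to_fetch)

Aggregating triple-level predictions per document reflects that prediction happens at the triple level while crawling necessarily operates at document granularity.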

Files

wi17-crawling.pdf (946.8 kB)
md5:803ccd2751aef8f5172e45df78478640

Additional details

Funding

MOVING – Training towards a society of data-savvy information professionals to enable open leadership innovation (Grant No. 693092)
European Commission