Poster (Open Access)

Free-Range Spiderbots!

Boruta, Luc


MARC21 XML Export

<?xml version='1.0' encoding='UTF-8'?>
<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam##2200000uu#4500</leader>
  <datafield tag="041" ind1=" " ind2=" ">
    <subfield code="a">eng</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">crawling</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">robots.txt</subfield>
  </datafield>
  <datafield tag="653" ind1=" " ind2=" ">
    <subfield code="a">digital preservation</subfield>
  </datafield>
  <controlfield tag="005">20181009213004.0</controlfield>
  <controlfield tag="001">1453453</controlfield>
  <datafield tag="856" ind1="4" ind2=" ">
    <subfield code="s">696909</subfield>
    <subfield code="z">md5:97186f8ac5ffb0a74bc300f7a69bdfa5</subfield>
    <subfield code="u">https://zenodo.org/record/1453453/files/20181012 — FORCE18 — Free-Range Spiderbots!.pdf</subfield>
  </datafield>
  <datafield tag="542" ind1=" " ind2=" ">
    <subfield code="l">open</subfield>
  </datafield>
  <datafield tag="260" ind1=" " ind2=" ">
    <subfield code="c">2018-10-09</subfield>
  </datafield>
  <datafield tag="909" ind1="C" ind2="O">
    <subfield code="p">openaire</subfield>
    <subfield code="p">user-force2018</subfield>
    <subfield code="o">oai:zenodo.org:1453453</subfield>
  </datafield>
  <datafield tag="100" ind1=" " ind2=" ">
    <subfield code="u">Thunken, Inc.</subfield>
    <subfield code="0">(orcid)0000-0003-0557-1155</subfield>
    <subfield code="a">Boruta, Luc</subfield>
  </datafield>
  <datafield tag="245" ind1=" " ind2=" ">
    <subfield code="a">Free-Range Spiderbots!</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">user-force2018</subfield>
  </datafield>
  <datafield tag="540" ind1=" " ind2=" ">
    <subfield code="u">http://creativecommons.org/licenses/by/4.0/legalcode</subfield>
    <subfield code="a">Creative Commons Attribution 4.0 International</subfield>
  </datafield>
  <datafield tag="650" ind1="1" ind2="7">
    <subfield code="a">cc-by</subfield>
    <subfield code="2">opendefinition.org</subfield>
  </datafield>
  <datafield tag="520" ind1=" " ind2=" ">
    <subfield code="a">&lt;p&gt;&lt;strong&gt;Free-range what!?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The robots exclusion standard, a.k.a. robots.txt, tells bots which resources of a website they may scan and crawl.&lt;br&gt;
Invalid or overzealous robots.txt files can lead to the loss of important data, breaking archives, search engines, and any app that links to or remixes scholarly data.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why should I care?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You care about open access, don&amp;rsquo;t you? This is about open access for bots, which fosters open access for humans.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Mind your manners&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The standard is purely advisory; it relies on the politeness of bots. Disallowing access to a page doesn&amp;rsquo;t protect it: if it is referenced or linked to, it can be found.&lt;br&gt;
We don&amp;rsquo;t advocate the deletion of robots.txt files. They are a lightweight mechanism to convey crucial information, e.g. the location of sitemaps. We want better robots.txt files.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bots must be allowed to roam the scholarly web freely&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Metadata harvesting protocols are great, but there is a lot of data that they do not capture, e.g. pricing and recommendations, and, at the scale of the web, few content providers actually use these protocols.&lt;br&gt;
The web is unstable: content drifts and servers crash, and this is inevitable. Lots of copies keep stuff safe, and crawlers are essential to maintaining and analyzing the permanent record of science.&lt;br&gt;
We want to start an informal open collective to lobby publishers, aggregators, and other stakeholders to standardize and minimize their robots.txt files, and other related directives like noindex tags.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Our first victory&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In September, we noticed that Hindawi prevented polite bots from accessing pages relating to retracted articles and peer-review fraud. Hindawi fixed their robots.txt after we brought the problem to their attention via Twitter. We can fix the web, one domain at a time!&lt;/p&gt;</subfield>
  </datafield>
  <datafield tag="773" ind1=" " ind2=" ">
    <subfield code="n">doi</subfield>
    <subfield code="i">isVersionOf</subfield>
    <subfield code="a">10.5281/zenodo.1453452</subfield>
  </datafield>
  <datafield tag="024" ind1=" " ind2=" ">
    <subfield code="a">10.5281/zenodo.1453453</subfield>
    <subfield code="2">doi</subfield>
  </datafield>
  <datafield tag="980" ind1=" " ind2=" ">
    <subfield code="a">poster</subfield>
  </datafield>
</record>
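
The abstract's advice is easy to make concrete. Below is a sketch of the kind of minimal, standardized robots.txt the poster argues for: it lets polite bots roam freely and still conveys crucial information such as the location of a sitemap. The host and sitemap URL are illustrative assumptions, not taken from any real publisher.

# A minimal, crawler-friendly robots.txt (illustrative sketch).
# An empty Disallow rule allows polite bots to crawl everything.
User-agent: *
Disallow:

# robots.txt is a lightweight place to convey crucial information,
# e.g. the location of a sitemap (hypothetical URL).
Sitemap: https://example.org/sitemap.xml

# By contrast, an overzealous file like the following locks out
# archives and search engines along with abusive bots:
#   User-agent: *
#   Disallow: /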
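The standard's advisory nature is also visible in code. Here is a minimal sketch, using Python's standard urllib.robotparser module, of the check a polite bot performs before fetching a URL; the user agent and URLs are hypothetical. Nothing in the protocol stops an impolite bot from skipping this check.

from urllib import robotparser

# A polite bot parses the site's robots.txt before crawling.
parser = robotparser.RobotFileParser()
parser.set_url("https://example.org/robots.txt")  # hypothetical host
parser.read()

# The check is purely advisory: disallowing a page does not protect it,
# since any bot can still fetch a URL it found via a reference or link.
url = "https://example.org/retractions/article-123"  # hypothetical page
if parser.can_fetch("ExampleBot/1.0", url):
    print("Polite bots may crawl", url)
else:
    print("Polite bots will skip", url)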
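The metadata harvesting protocols mentioned in the abstract include OAI-PMH, which Zenodo itself exposes: the 909 field above records the OAI identifier oai:zenodo.org:1453453 and the set user-force2018. As a sketch, one harvesting request against Zenodo's public OAI-PMH endpoint might look like the following; only metadata comes back, which is why crawlers are still needed for everything the protocol does not capture.

import requests

# One OAI-PMH ListRecords request for the FORCE2018 community set.
resp = requests.get(
    "https://zenodo.org/oai2d",
    params={
        "verb": "ListRecords",
        "metadataPrefix": "oai_dc",
        "set": "user-force2018",
    },
    timeout=30,
)
resp.raise_for_status()

# The response is OAI-PMH XML; in practice, parse it with an XML library
# and follow resumptionToken elements to page through the full set.
print(resp.text[:500])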