Conference paper Open Access

Addressing Social Bias in Information Retrieval

Jahna Otterbacher

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Jahna Otterbacher</dc:creator>
  <dc:description>Journalists and researchers alike have claimed that IR systems are socially biased, returning results to users that perpetuate gender and racial stereotypes. In this position paper, I argue that IR researchers and, in particular, evaluation communities such as CLEF, can and should address such concerns. Using as a guide the Principles for Algorithmic Transparency and Accountability recently put forward by the Association for Computing Machinery, I provide examples of techniques for examining social biases in IR systems and, in particular, search engines.</dc:description>
  <dc:description>This work has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No 739578 and the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development.

This is a pre-print of an article published in Experimental IR Meets Multilinguality, Multimodality, and Interaction: 9th International Conference of the CLEF Association, CLEF 2018, Avignon, France, September 10-14, 2018, Proceedings. The final authenticated version is available online at © Springer Nature Switzerland AG 2018.</dc:description>
  <dc:publisher>Springer Nature Switzerland AG</dc:publisher>
  <dc:subject>Social biases</dc:subject>
  <dc:subject>Ranking algorithms</dc:subject>
  <dc:title>Addressing Social Bias in Information Retrieval</dc:title>
</oai_dc:dc>
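A minimal sketch of consuming this Dublin Core export with Python's standard-library `xml.etree.ElementTree`. The namespace URIs are the standard OAI-DC ones, assumed here because harvested exports sometimes arrive with their `xmlns` attributes stripped; the embedded record is a trimmed copy of the one shown above.

```python
import xml.etree.ElementTree as ET

# Standard OAI Dublin Core namespace URIs (an assumption to verify
# against the actual export, which may omit them).
NS = {
    "dc": "http://purl.org/dc/elements/1.1/",
    "oai_dc": "http://www.openarchives.org/OAI/2.0/oai_dc/",
}

# A trimmed copy of the record shown above.
record = """\
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/"
           xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/">
  <dc:creator>Jahna Otterbacher</dc:creator>
  <dc:title>Addressing Social Bias in Information Retrieval</dc:title>
  <dc:subject>Social biases</dc:subject>
  <dc:subject>Ranking algorithms</dc:subject>
</oai_dc:dc>"""

root = ET.fromstring(record)
# findtext/findall resolve the "dc:" prefix through the NS mapping.
title = root.findtext("dc:title", namespaces=NS)
subjects = [el.text for el in root.findall("dc:subject", NS)]
print(title)     # Addressing Social Bias in Information Retrieval
print(subjects)  # ['Social biases', 'Ranking algorithms']
```

Repeatable elements such as `dc:subject` are collected with `findall`, since Dublin Core allows any element to occur more than once.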