Published September 14, 2018 | Version: Accepted pre-print
Conference paper | Open Access

Addressing Social Bias in Information Retrieval

  • 1. Open University of Cyprus, Nicosia, Cyprus and Research Centre on Interactive Media, Smart Systems and Emerging Technologies, Nicosia, Cyprus

Description

Journalists and researchers alike have claimed that IR systems are socially biased, returning results to users that perpetuate gender and racial stereotypes. In this position paper, I argue that IR researchers, and in particular evaluation communities such as CLEF, can and should address such concerns. Using as a guide the Principles for Algorithmic Transparency and Accountability recently put forward by the Association for Computing Machinery, I provide examples of techniques for examining social biases in IR systems and, in particular, search engines.

Notes

This work has received funding from the European Union's Horizon 2020 Research and Innovation Programme under Grant Agreement No 739578 and the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development. This is a pre-print of an article published in Experimental IR Meets Multilinguality, Multimodality, and Interaction: 9th International Conference of the CLEF Association, CLEF 2018, Avignon, France, September 10-14, 2018, Proceedings. The final authenticated version is available online at https://www.springer.com/la/book/9783319989310. © Springer Nature Switzerland AG 2018.

Files

Otterbacher-CLEF-2018.pdf (672.1 kB, md5:15d6651b737d2624d8cdcdcc9b6cec37)

Additional details

Funding

RISE – Research Centre on Interactive Media, Smart Systems and Emerging Technologies (Grant Agreement No 739578)
European Commission
