Conference paper

Addressing Social Bias in Information Retrieval

Jahna Otterbacher

Journalists and researchers alike have claimed that IR systems are socially biased, returning results to users that perpetuate gender and racial stereotypes. In this position paper, I argue that IR researchers, and in particular evaluation communities such as CLEF, can and should address such concerns. Using as a guide the Principles for Algorithmic Transparency and Accountability recently put forward by the Association for Computing Machinery, I provide examples of techniques for examining social biases in IR systems, and in particular in search engines.

This work has been partly supported by the RISE project, which has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No. 739578 (Call: H2020-WIDESPREAD-01-2016-2017-TeamingPhase2), and by the Government of the Republic of Cyprus through the Directorate General for European Programmes, Coordination and Development. This is a preprint of an article published in Experimental IR Meets Multilinguality, Multimodality, and Interaction: 9th International Conference of the CLEF Association, CLEF 2018, Avignon, France, September 10-14, 2018, Proceedings. The final authenticated version is available online at https://www.springer.com/la/book/9783319989310. © Springer Nature Switzerland AG 2018.
Embargoed Access

Files are currently under embargo but will be publicly accessible after September 14, 2019.

