Published July 11, 2022 | Version v1
Conference paper · Open Access

On The Nature of Misidentification With Privacy Preserving Algorithms

  • 1. Vienna University of Technology, Austria
  • 2. University of Alicante, Spain


The ubiquitous use of computer vision and camera surveillance makes it increasingly easy to automatically recognize persons in images and video. In this context, obfuscation methods such as blurring and pixelation can protect privacy by preventing facial recognition. But even when these techniques successfully obscure the subject’s identity, the question of who is recognized in their stead, and what influences this misidentification, remains open. As facial recognition is an area particularly prone to demographic bias, we analyse misidentifications along the lines of race and gender. We show that persons are most often mistaken for someone of their own gender. In terms of racial bias, however, white people tend to be under-represented among the misidentifications.
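The pixelation obfuscation mentioned in the abstract can be illustrated with a minimal sketch: the image is divided into square tiles and each tile is replaced by its mean intensity, destroying the fine facial detail that recognition models rely on. The function below is an illustrative assumption, not code from the paper; the block size parameter is hypothetical.

```python
import numpy as np

def pixelate(image: np.ndarray, block: int = 8) -> np.ndarray:
    """Pixelate a 2-D grayscale image by averaging over block x block tiles.

    This is a generic sketch of the technique, not the paper's implementation.
    """
    h, w = image.shape
    out = image.astype(float).copy()
    for y in range(0, h, block):
        for x in range(0, w, block):
            # Replace every pixel in the tile with the tile's mean value.
            tile = out[y:y + block, x:x + block]
            tile[...] = tile.mean()
    return out.astype(image.dtype)
```

Larger block sizes remove more identifying detail but also more scene context; the paper's analysis concerns who a recognizer identifies once such detail is gone.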



Additional details


Funding: visuAAL – Privacy-Aware and Acceptable Video-Based Technologies and Services for Active and Assisted Living (grant 861091), European Commission