Published September 14, 2020 | Version v1
Conference paper · Open Access

Crowdwork as a Snapshot in Time: Image Annotation Tasks during a Pandemic


While crowdsourcing provides a convenient solution for tapping into human intelligence, a concern is the bias inherent in the data collected. Events related to the COVID-19 pandemic affected people globally, and crowdworkers were no exception. Given the evidence on how mood and stress affect work, we explore how temporal events might influence crowdsourced data. We replicated an image annotation task conducted in 2018, in which workers describe images of people. We expected the 2020 annotations to contain more references to health than the 2018 data. Overall, we find no evidence that health-related tags were used more often in 2020; instead, we find a significant increase in the use of tags related to weight (e.g., fat, chubby, overweight). This result, coupled with the "stay at home" orders in effect in 2020, illustrates how crowdwork is impacted by temporal events.




Additional details


Funding:
- CyCAT – Cyprus Center for Algorithmic Transparency (Grant No. 810105), European Commission
- RISE – Research Center on Interactive Media, Smart Systems and Emerging Technologies (Grant No. 739578), European Commission