Crowdwork as a Snapshot in Time: Image Annotation Tasks during a Pandemic
While crowdsourcing provides a convenient means of tapping into human intelligence, a persistent concern is the bias inherent in the data collected. Events related to the COVID-19 pandemic affected people globally, and crowdworkers were no exception. Given the evidence on how mood and stress affect work, we explore how temporal events might influence crowdsourced data. We replicated an image annotation task conducted in 2018, in which workers describe images of people. We expected the 2020 annotations to contain more references to health than the 2018 data. Overall, we find no evidence that health-related tags were used more often in 2020; instead, we find a significant increase in the use of tags related to weight (e.g., fat, chubby, overweight). This result, coupled with the stay-at-home orders in effect in 2020, illustrates how crowdwork is shaped by temporal events.