PHyMAtt (Personalised Hygienic Mask Attacks)
Description
This dataset was used to perform the experiments reported in the IJCB2023 paper "Can personalised hygienic masks be used to attack face recognition systems?".
The dataset consists of face videos captured using the ‘selfie’ cameras of five different smartphones: Apple iPhone 12, Apple iPhone 6s, Xiaomi Redmi 6 Pro, Xiaomi Redmi 9A and Samsung Galaxy S9. The dataset contains:
- Bona-fide face videos: 1400 videos of bona-fide (real, non-attack) faces from 70 identities (data subjects). Each video is 10 seconds long: for the first 5 seconds the data subject was asked to stay still and look at the camera, and for the last 5 seconds to turn their head from one side to the other (so that profile views could be captured). The videos were acquired indoors, under normal office lighting conditions. The data subjects were volunteers who attended two recording sessions, separated on average by about three weeks. In each session, the volunteers recorded a video of their own face using the front (i.e., selfie) camera of each of the five smartphones listed above. The face data was additionally captured while the data subjects wore plain (non-personalised) hygienic masks, to simulate the scenario where face recognition must be performed on a masked face (e.g., during a pandemic like COVID-19). The resulting video counts are summarised in the sketch after this list.
- Attacks:
  - Personalised hygienic mask attacks: Video recordings of an impostor wearing personalised hygienic masks (one at a time), each printed with the bottom part of a data subject’s face. Note that the dataset contains 350 personalised hygienic mask attack videos, whereas the IJCB2023 paper mentions 345; for the experiments reported in the paper, we excluded the videos in which the attacker wore their own hygienic mask (since the attacker was one of the 70 data subjects).
  - Print attacks: 1400 video recordings of the data subjects’ face photos printed on A4 matte paper, which was held up to the smartphone’s camera.
  - Replay attacks: 2800 video recordings in which bona-fide face videos were replayed to the target smartphone’s camera. The phones were paired, such that one phone of each pair was used to replay a bona-fide video while the second (attacked) phone recorded it using its front camera.
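For convenience, the video counts stated above can be cross-checked with the short Python sketch below. The factorisation of the bona-fide and personalised-mask totals is an inferred assumption (70 subjects × 2 sessions × 5 phones × 2 mask conditions, and 70 masks × 5 phones, respectively); the directory layout and file naming of the released dataset are not described here and may differ.

    # Hypothetical breakdown of the PHyMAtt video counts, inferred from the
    # description above; the released dataset's organisation may differ.
    N_SUBJECTS = 70        # data subjects (identities)
    N_SESSIONS = 2         # recording sessions per subject
    N_PHONES = 5           # iPhone 12, iPhone 6s, Redmi 6 Pro, Redmi 9A, Galaxy S9
    N_MASK_CONDITIONS = 2  # no mask / plain hygienic mask (assumed split)

    # Bona-fide: one 10-second video per subject, session, phone and mask condition.
    bona_fide = N_SUBJECTS * N_SESSIONS * N_PHONES * N_MASK_CONDITIONS
    assert bona_fide == 1400

    # Personalised hygienic mask attacks: assumed one video per subject's mask
    # on each phone (350 in the dataset; 345 used in the IJCB2023 paper after
    # removing the attacker's own-mask videos).
    mask_attacks = N_SUBJECTS * N_PHONES
    assert mask_attacks == 350

    # Print and replay attack totals as stated in the description.
    print_attacks = 1400
    replay_attacks = 2800

    total = bona_fide + mask_attacks + print_attacks + replay_attacks
    print(f"Total videos described: {total}")  # 5950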
Reference
If you use the data for your research or publication, please cite the following paper:
A. Komaty, V. Krivokuca Hahn, C. Ecabert and S. Marcel, "Can personalised hygienic masks be used to attack face recognition systems?", in: Proceedings of the IEEE International Joint Conference on Biometrics (IJCB), 2023.