Published May 4, 2022 | Version v1
Conference paper | Open Access

FAME video browser – face recognition based metadata generation for performing art videos

  • 1. Ghent University-imec, IDLab
  • 2. meemoo, Flemish Institute for Archives
  • 3. Flanders Arts Institute


This demonstrator presents one of the end results of the FAME project [1]. The main focus of this project is the development of a generic open-source face recognition pipeline that can be applied to a broad variety of cultural heritage video/image archives, the performing arts collection of the Flanders Arts Institute [2] being one of them. Performing arts images and videos are among the most challenging content for face recognition, given their dynamic and heterogeneous nature and the variety of domains they span. If the pipeline works on this type of content, it will most likely also work on other types of archived content.

Due to the lack of accurate shot-level metadata, the searchability of performing arts videos is still rather limited: mostly they can only be queried on global metadata. Retrieving the exact shot(s) in which an actor or dancer appears, for example, is currently impossible. Shots can of course be annotated manually, but this is an error-prone and labour-intensive process, especially for a large collection. Within the FAME project, we address this issue and propose a video content annotation tool for the automatic annotation of faces in performing arts videos. The resulting JSON metadata (containing the Wikidata IDs of the recognized people and the timestamps at which they appear in the video) is used in a video browser web application that lets users query a video for the recognized performers and find the shots in which they appear.
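To make the idea concrete, the sketch below shows how such shot-level metadata could be queried. The JSON schema, field names, and Wikidata IDs are illustrative assumptions, not the actual FAME output format; only the general shape (Wikidata ID plus appearance timestamps) follows the description above.

```python
import json

# Hypothetical example of shot-level face metadata; the schema and IDs
# are assumptions for illustration, not the actual FAME output format.
metadata = json.loads("""
{
  "video": "performance.mp4",
  "appearances": [
    {"wikidata_id": "Q1000000", "start": 12.4, "end": 18.9},
    {"wikidata_id": "Q1000001", "start": 40.0, "end": 55.2},
    {"wikidata_id": "Q1000000", "start": 61.0, "end": 70.5}
  ]
}
""")

def shots_for(performer_id: str) -> list[tuple[float, float]]:
    """Return (start, end) timestamps of shots in which a performer appears."""
    return [(a["start"], a["end"])
            for a in metadata["appearances"]
            if a["wikidata_id"] == performer_id]

print(shots_for("Q1000000"))  # → [(12.4, 18.9), (61.0, 70.5)]
```

A video browser front end could use such a lookup to seek the player directly to each shot in which the selected performer was recognized.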
