Conference paper Open Access
Zampoglou, Markos; Markatopoulou, Foteini; Mercier, Gregoire; Touska, Despoina; Apostolidis, Evlampios; Papadopoulos, Symeon; Cozien, Roger; Patras, Ioannis; Mezaris, Vasileios; Kompatsiaris, Ioannis
User-Generated Content (UGC) has become an integral part of the news reporting cycle. As a result, the need to verify videos collected from social media and Web sources is becoming increasingly important for news organisations. While video verification is attracting a lot of attention, there has been limited effort so far in applying video forensics to real-world data. In this work we present an approach for automatic video manipulation detection inspired by manual verification approaches. In a typical manual verification setting, video filter outputs are visually interpreted by human experts. We use two such forensics filters designed for manual verification, one based on Discrete Cosine Transform (DCT) coefficients and a second based on video requantization errors, and combine them with Deep Convolutional Neural Networks (CNN) designed for image classification. We compare the performance of the proposed approach to other works from the state of the art, and find that, while competing approaches perform better when trained with videos from the same dataset, one of the proposed filters demonstrates superior performance in cross-dataset settings. We discuss the implications of our work and the limitations of the current experimental setup, and propose directions for future research in this area.
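To illustrate the requantization-error idea mentioned in the abstract, the sketch below computes a simplified per-pixel error map for a single frame: each 8x8 block is transformed with a 2-D DCT, quantized with a fixed step, reconstructed, and compared with the original. This is only a minimal, hypothetical sketch of the general concept, not the filter actually used in the paper (whose exact formulation is not given here); the function names, the quantization step `q`, and the block-wise scheme are all illustrative assumptions.

```python
import numpy as np

def _dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix: rows are frequencies, columns are samples.
    k = np.arange(n)
    C = np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    C[0, :] *= 1 / np.sqrt(2)
    return C * np.sqrt(2 / n)

def dct2(block):
    # Separable 2-D DCT-II of a square block.
    C = _dct_matrix(block.shape[0])
    return C @ block @ C.T

def idct2(coeffs):
    # Inverse 2-D DCT (the basis matrix is orthogonal, so its transpose inverts it).
    C = _dct_matrix(coeffs.shape[0])
    return C.T @ coeffs @ C

def requantization_error_map(frame, q=16):
    """Quantize each 8x8 DCT block with step q (illustrative value),
    reconstruct it, and return the absolute pixel-wise error map.
    Regions that were already compressed differently tend to leave
    different error statistics, which is the intuition behind such filters."""
    h, w = frame.shape
    err = np.zeros((h, w), dtype=np.float64)
    for y in range(0, h - h % 8, 8):
        for x in range(0, w - w % 8, 8):
            block = frame[y:y + 8, x:x + 8].astype(np.float64)
            recon = idct2(np.round(dct2(block) / q) * q)
            err[y:y + 8, x:x + 8] = np.abs(block - recon)
    return err
```

In the setting the abstract describes, a map like this (computed per frame) would be fed to a CNN image classifier instead of being inspected visually by a human expert.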