Conference paper Open Access

Detecting tampered videos with multimedia forensics and deep learning

Zampoglou, Markos; Markatopoulou, Foteini; Mercier, Gregoire; Touska, Despoina; Apostolidis, Evlampios; Papadopoulos, Symeon; Cozien, Roger; Patras, Ioannis; Mezaris, Vasileios; Kompatsiaris, Ioannis

User-Generated Content (UGC) has become an integral part of the news reporting cycle. As a result, the need to verify videos collected from social media and Web sources is becoming increasingly important for news organisations. While video verification is attracting a lot of attention, there has been limited effort so far in applying video forensics to real-world data. In this work we present an approach for automatic video manipulation detection inspired by manual verification practices, in which the outputs of video forensics filters are visually interpreted by human experts. We use two such filters designed for manual verification, one based on Discrete Cosine Transform (DCT) coefficients and one based on video requantization errors, and combine them with deep Convolutional Neural Networks (CNNs) designed for image classification. We compare the performance of the proposed approach to other works from the state of the art, and find that, while competing approaches perform better when trained with videos from the same dataset, one of the proposed filters demonstrates superior performance in cross-dataset settings. We discuss the implications of our work and the limitations of the current experimental setup, and propose directions for future research in this area.
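The abstract describes a pipeline in which the output of a forensics filter (rather than the raw frame) is fed to a CNN classifier. The sketch below illustrates that idea only; the `dct_filter_map` function and the `FilterMapCNN` architecture are hypothetical stand-ins under assumed choices (per-block DCT energy as the filter, a tiny PyTorch CNN as the classifier), not the authors' exact filters or network.

```python
# Illustrative sketch: a DCT-based filter map fed to a small CNN classifier.
# All names and design choices here are assumptions, not the paper's implementation.
import numpy as np
import torch
import torch.nn as nn
from scipy.fft import dctn


def dct_filter_map(frame_gray: np.ndarray, block: int = 8) -> np.ndarray:
    """Compute a per-8x8-block map of (AC) DCT coefficient energy.

    Stand-in for a DCT-coefficient forensics filter; tampered regions often
    show locally inconsistent DCT statistics after recompression.
    """
    h, w = frame_gray.shape
    h, w = h - h % block, w - w % block
    out = np.zeros((h // block, w // block), dtype=np.float32)
    for i in range(0, h, block):
        for j in range(0, w, block):
            coeffs = dctn(frame_gray[i:i + block, j:j + block].astype(np.float32),
                          norm="ortho")
            coeffs[0, 0] = 0.0  # drop the DC term, keep AC energy only
            out[i // block, j // block] = np.abs(coeffs).sum()
    return out


class FilterMapCNN(nn.Module):
    """Small CNN that classifies a single-channel filter map as pristine or tampered."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, 2)  # two classes: pristine / tampered

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


if __name__ == "__main__":
    # Dummy grayscale frame stands in for a decoded video frame.
    frame = np.random.randint(0, 256, (360, 640)).astype(np.float32)
    fmap = dct_filter_map(frame)
    x = torch.from_numpy(fmap).unsqueeze(0).unsqueeze(0)  # shape (1, 1, H/8, W/8)
    logits = FilterMapCNN()(x)
    print(logits.shape)  # torch.Size([1, 2])
```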

Files (1.5 MB)
mmm19_lncs11295_2_preprint.pdf (1.5 MB, md5:ac4366cefbb63a0f4b28fed320b558a2)
Views 98
Downloads 59
Data volume 88.9 MB
Unique views 91
Unique downloads 51
