Journal article Open Access
Panagiotis Giannakeris; Panagiotis C. Petrantonakis; Konstantinos Avgerinakis; Stefanos Vrochidis; Ioannis Kompatsiaris
A novel first-person human activity recognition framework is proposed in this work. Our methodology is inspired by the central role that moving objects play in egocentric activity videos. Using a Deep Convolutional Neural Network we detect objects and develop discriminant object flow histograms in order to represent fine-grained micro-actions during short temporal windows. Our framework is based on the assumption that large-scale activities are synthesized from fine-grained micro-actions. We gather all the micro-actions and perform Gaussian Mixture Model clustering, so as to build a micro-action vocabulary that is later used in a Fisher encoding scheme. Results show that our method can reach a 60% recognition rate on the benchmark ADL dataset. The capabilities of the proposed framework are also showcased by an extensive evaluation over a wide range of hyper-parameters and a comparison with other state-of-the-art works.
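The abstract's vocabulary-building and encoding step can be illustrated with a minimal sketch: fit a Gaussian Mixture Model on micro-action descriptors (here, placeholder object flow histograms) and encode each video as a Fisher vector. The array sizes, variable names, and normalization choices below are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative placeholder settings (not the paper's actual values).
N_DESCRIPTORS, DESCRIPTOR_DIM, N_COMPONENTS = 5000, 64, 16

# Micro-action descriptors gathered from all training videos (placeholder data).
descriptors = np.random.rand(N_DESCRIPTORS, DESCRIPTOR_DIM)

# Step 1: build the micro-action vocabulary with a Gaussian Mixture Model.
gmm = GaussianMixture(n_components=N_COMPONENTS, covariance_type="diag", random_state=0)
gmm.fit(descriptors)

def fisher_vector(local_descriptors, gmm):
    """Encode one video's micro-action descriptors as a Fisher vector
    (first- and second-order statistics w.r.t. the GMM vocabulary)."""
    q = gmm.predict_proba(local_descriptors)           # (T, K) soft assignments
    T = local_descriptors.shape[0]
    mu, sigma = gmm.means_, np.sqrt(gmm.covariances_)  # (K, D) each, diagonal covariances
    w = gmm.weights_
    # Normalized deviations of each descriptor from each Gaussian component.
    diff = (local_descriptors[:, None, :] - mu[None, :, :]) / sigma[None, :, :]  # (T, K, D)
    # Gradients w.r.t. means and variances, averaged over descriptors.
    d_mu = np.einsum("tk,tkd->kd", q, diff) / (T * np.sqrt(w)[:, None])
    d_sigma = np.einsum("tk,tkd->kd", q, diff ** 2 - 1) / (T * np.sqrt(2 * w)[:, None])
    fv = np.concatenate([d_mu.ravel(), d_sigma.ravel()])
    # Power and L2 normalization, commonly applied to Fisher encodings.
    fv = np.sign(fv) * np.sqrt(np.abs(fv))
    return fv / (np.linalg.norm(fv) + 1e-12)

# Step 2: encode one video's descriptors; the result feeds a downstream classifier.
video_fv = fisher_vector(np.random.rand(120, DESCRIPTOR_DIM), gmm)
print(video_fv.shape)  # (2 * N_COMPONENTS * DESCRIPTOR_DIM,)
```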
Name | Size
---|---
mtap2019_giannakeris_et_al_first_person_activity_final_open.pdf (md5:52d9c4d6d5d5a8ce2ff325826c13084c) | 3.2 MB
 | All versions | This version
---|---|---
Views | 72 | 72
Downloads | 167 | 167
Data volume | 533.7 MB | 533.7 MB
Unique views | 65 | 65
Unique downloads | 162 | 162