Published October 18, 2019 | Version v1
Conference paper (Open Access)

Real-time Recognition of Daily Actions Based on 3D Joint Movements and Fisher Encoding

Description

Recognition of daily actions is an essential part of Ambient Assisted Living (AAL) applications and is still not fully solved. In this work, we propose a novel framework for the recognition of actions of daily living from depth videos. The framework is based on low-level human pose movement descriptors extracted from 3D joint trajectories, as well as differential values that encode speed and acceleration information. The joints are detected using a depth sensor. The low-level descriptors are then aggregated into discriminative high-level action representations by modeling prototype pose movements with Gaussian Mixtures and applying a Fisher encoding scheme. The resulting Fisher vectors are suitable for training Linear SVM classifiers to recognize actions in pre-segmented video clips, alleviating the need for additional parameter search with non-linear kernels or neural network tuning. Experimental evaluation on two well-known RGB-D action datasets reveals that the proposed framework achieves close to state-of-the-art performance whilst maintaining high processing speeds.
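The encoding step described above (a GMM fit over low-level pose-movement descriptors, followed by Fisher vector aggregation per clip) can be sketched as follows. This is an illustrative implementation of standard Fisher encoding with a diagonal-covariance GMM, not the authors' code; all function and variable names are our own:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fisher_vector(descriptors, gmm):
    """Encode a (T, D) set of local descriptors into a Fisher vector using
    gradients of the GMM log-likelihood w.r.t. the means and variances."""
    T, D = descriptors.shape
    K = gmm.n_components
    q = gmm.predict_proba(descriptors)        # soft assignments, (T, K)
    pi = gmm.weights_                         # mixture weights, (K,)
    mu = gmm.means_                           # component means, (K, D)
    sigma = np.sqrt(gmm.covariances_)         # std devs, (K, D) for 'diag'

    fv_mu = np.zeros((K, D))
    fv_sigma = np.zeros((K, D))
    for k in range(K):
        diff = (descriptors - mu[k]) / sigma[k]                       # (T, D)
        fv_mu[k] = (q[:, k, None] * diff).sum(0) / (T * np.sqrt(pi[k]))
        fv_sigma[k] = (q[:, k, None] * (diff**2 - 1)).sum(0) / (T * np.sqrt(2 * pi[k]))

    fv = np.concatenate([fv_mu.ravel(), fv_sigma.ravel()])
    # Power- and L2-normalization, commonly applied before linear SVMs.
    fv = np.sign(fv) * np.sqrt(np.abs(fv))
    return fv / (np.linalg.norm(fv) + 1e-12)
```

In a pipeline like the one described, the GMM would be fit on descriptors pooled from training clips, each clip would be encoded into one fixed-length Fisher vector (dimension 2·K·D here), and those vectors would feed a Linear SVM.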

Notes

This research has been co-financed by the European Regional Development Fund of the European Union and Greek national funds through the Operational Program Competitiveness, Entrepreneurship and Innovation, under the call RESEARCH - CREATE - INNOVATE (T1EDK-00686) and the EC funded project V4Design (H2020-779962).

Files

giannakeris2019mmm.pdf (245.4 kB)
md5:6b16f7207a9a68535f0a2d85ed41f88d

Additional details

Funding

V4Design – Visual and textual content re-purposing FOR(4) architecture, Design and video virtual reality games 779962
European Commission