Athina Tsanousa
Georgios Meditskos
Stefanos Vrochidis
Ioannis Kompatsiaris
2019-07-15
<p>With ongoing technological advances and the constantly emerging assisted living applications, sensor-based activity recognition research has received great attention. Until recently, most relevant research extracted knowledge from single modalities; however, when the performance of individual sensors is unsatisfactory, combining information from multiple sensors can improve the activity recognition rate. Early and late fusion strategies are commonly employed to merge multiple sensors. This paper proposes a novel framework for combining accelerometers and gyroscopes at the decision level in order to recognize human activity. More specifically, we propose a weighted late fusion framework that utilizes the detection rate of each classifier. Furthermore, we propose a modification of an existing class-based weighted late fusion framework. Experimental results on a publicly available and widely used dataset demonstrate that combining accelerometer and gyroscope under the proposed frameworks improves classification performance.</p>
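<p>The general idea of weighted late fusion described in the abstract can be sketched as follows: each sensor's classifier produces class probabilities, and these are combined with weights proportional to each classifier's detection rate. This is an illustrative sketch only, not the paper's exact formulation; the probability values and detection rates below are hypothetical.</p>

```python
import numpy as np

def weighted_late_fusion(probs_by_sensor, detection_rates):
    """Fuse per-sensor class-probability vectors at decision level,
    weighting each sensor classifier by its detection rate."""
    w = np.asarray(detection_rates, dtype=float)
    w = w / w.sum()  # normalize weights so they sum to 1
    probs = np.asarray(probs_by_sensor, dtype=float)  # shape: (n_sensors, n_classes)
    fused = (w[:, None] * probs).sum(axis=0)          # weighted average of probabilities
    return int(np.argmax(fused)), fused

# Hypothetical per-class probabilities for three activity classes
acc_probs  = [0.6, 0.3, 0.1]  # accelerometer classifier output
gyro_probs = [0.2, 0.5, 0.3]  # gyroscope classifier output

# Hypothetical detection rates of the two classifiers
label, fused = weighted_late_fusion([acc_probs, gyro_probs], [0.9, 0.7])
```

<p>Because the fused vector is a convex combination of the two probability vectors, it remains a valid probability distribution, and the predicted activity is simply its argmax.</p>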
https://doi.org/10.5281/zenodo.3507004
oai:zenodo.org:3507004
Zenodo
https://doi.org/10.5281/zenodo.3507003
info:eu-repo/semantics/openAccess
Creative Commons Attribution 4.0 International
https://creativecommons.org/licenses/by/4.0/legalcode
A Weighted Late Fusion Framework for Recognizing Human Activity from Wearable Sensors
info:eu-repo/semantics/conferencePaper