Conference paper Open Access

A Weighted Late Fusion Framework for Recognizing Human Activity from Wearable Sensors

Athina Tsanousa; Georgios Meditskos; Stefanos Vrochidis; Ioannis Kompatsiaris

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Athina Tsanousa</dc:creator>
  <dc:creator>Georgios Meditskos</dc:creator>
  <dc:creator>Stefanos Vrochidis</dc:creator>
  <dc:creator>Ioannis Kompatsiaris</dc:creator>
  <dc:description>With ongoing technological advances and the
constantly emerging assisted living applications, sensor-based activity
recognition research receives great attention. Until recently,
most relevant research involved extracting knowledge
from single modalities; however, when the performance of individual
sensors is not satisfactory, combining information from multiple
sensors can improve the activity recognition rate.
Early and late fusion strategies are usually employed
to merge multiple sensors successfully. This paper proposes a
novel framework for combining accelerometers and gyroscopes
at the decision level in order to recognize human activity. More
specifically, we propose a weighted late fusion framework that
utilizes the detection rate of a classifier. Furthermore, we propose
a modification of an existing class-based weighted late
fusion framework. Experimental results on a publicly available
and widely used dataset demonstrate that combining an
accelerometer and a gyroscope under the proposed frameworks
improves classification performance.</dc:description>
  <dc:title>A Weighted Late Fusion Framework for Recognizing Human Activity from Wearable Sensors</dc:title>
</oai_dc:dc>
Statistics          All versions    This version
Views               34              34
Downloads           291             291
Data volume         61.4 MB         61.4 MB
Unique views        33              33
Unique downloads    283             283


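The decision-level fusion described in the abstract — weighting each sensor's classifier by its detection rate — might be sketched as follows. This is a minimal illustration, not the paper's implementation; the function name, the use of validation accuracy as the detection rate, and the example probabilities are all assumptions.

```python
# Hypothetical sketch of weighted late fusion at the decision level:
# each sensor-specific classifier outputs class probabilities, and its
# vote is weighted by its detection rate (e.g. validation accuracy).

def weighted_late_fusion(prob_outputs, detection_rates):
    """Fuse per-classifier class-probability vectors into one decision.

    prob_outputs: list of probability vectors, one per classifier.
    detection_rates: one weight per classifier (assumed here to be
        each classifier's validation accuracy).
    Returns the index of the winning class.
    """
    total = sum(detection_rates)
    weights = [w / total for w in detection_rates]  # normalize to sum to 1
    n_classes = len(prob_outputs[0])
    fused = [0.0] * n_classes
    for probs, w in zip(prob_outputs, weights):
        for c in range(n_classes):
            fused[c] += w * probs[c]  # weighted sum of class scores
    return max(range(n_classes), key=lambda c: fused[c])

# Example (invented numbers): the accelerometer model favors class 0,
# the gyroscope model favors class 1; the accelerometer's higher
# detection rate tips the fused decision toward class 0.
acc_probs = [0.7, 0.2, 0.1]
gyr_probs = [0.3, 0.6, 0.1]
print(weighted_late_fusion([acc_probs, gyr_probs], [0.9, 0.6]))  # -> 0
```

The class-based variant the abstract also mentions would replace the single per-classifier weight with one weight per (classifier, class) pair.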