Conference paper Open Access
Giannakopoulos, Theodoros; Konstantopoulos, Stasinos
This paper presents a method for recognizing activities taking place in a home environment. Audio is recorded and analysed in real time, with all computation taking place on a low-cost Raspberry Pi. In this way, data acquisition, low-level signal feature calculation, and low-level event extraction are performed without transferring any raw data out of the device. This first-level analysis produces a time-series of low-level audio events and their characteristics: the event type (e.g., "music") and acoustic features that are relevant to further processing, such as energy, which indicates how loud the event was. This output is used by a meta-classifier that extracts long-term features from multiple events and recognizes higher-level activities. The paper also presents experimental results on recognizing kitchen and living-room activities of daily living that are relevant to assistive living and remote health monitoring for the elderly. Evaluation on this dataset has shown that our approach discriminates between six activities with an accuracy of more than 90%, that our two-level classification approach outperforms one-level classification, and that including low-level acoustic features (such as energy) in the input of the meta-classifier significantly boosts performance.
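As an illustration of the two-level idea described in the abstract, the sketch below aggregates a window of first-level outputs (an event label plus an acoustic feature such as energy) into long-term features suitable for a meta-classifier. The event labels, window representation, and feature choices here are illustrative assumptions, not the paper's actual feature set.

```python
from collections import Counter
from statistics import mean, pstdev

# Hypothetical low-level event labels; the paper's actual label set may differ.
EVENT_TYPES = ["music", "speech", "water", "silence"]

def meta_features(events):
    """Aggregate a window of low-level audio events into long-term features.

    `events` is a list of (event_type, energy) pairs, mimicking the
    first-level output described in the abstract: an event label plus
    an acoustic feature (energy) indicating how loud the event was.
    """
    n = len(events)
    counts = Counter(label for label, _ in events)
    # Normalised event-type histogram: long-term distribution of event labels.
    hist = [counts.get(t, 0) / n for t in EVENT_TYPES]
    # Statistics of the energy feature over the window, so the
    # meta-classifier also sees low-level acoustic information.
    energies = [e for _, e in events]
    return hist + [mean(energies), pstdev(energies)]

# Example window of first-level events (label, energy):
window = [("music", 0.8), ("music", 0.7), ("speech", 0.3), ("silence", 0.05)]
print(meta_features(window))
```

The resulting fixed-length vector could then be fed to any standard classifier trained on labelled activity windows; the abstract's finding that adding energy statistics boosts performance corresponds to the last two entries of this vector.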