Journal article

Energy expenditure estimation using visual and inertial sensors

Lili Tao, Tilo Burghardt, Majid Mirmehdi, Dima Damen, Ashley Cooper, Massimo Camplani, Sion Hannuna, Adeline Paiement, Ian Craddock

IET Computer Vision

Swansea University Author: Adeline Paiement


Published in: IET Computer Vision
ISSN: 1751-9632, 1751-9640
Published: 2017

URI: https://cronfa.swan.ac.uk/Record/cronfa36266
Abstract: Deriving a person’s energy expenditure accurately forms the foundation for tracking physical activity levels across many health and lifestyle monitoring tasks. In this work, we present a method for estimating calorific expenditure from combined visual and accelerometer sensors by way of an RGB-Depth camera and a wearable inertial sensor. The proposed individual-independent framework fuses information from both modalities, leading to improved estimates beyond the accuracy of single-modality and manual metabolic lookup table (MET) based methods. For evaluation, we introduce a new dataset, SPHERE_RGBD+Inertial_calorie, for which visual and inertial data are obtained simultaneously with indirect calorimetry ground-truth measurements based on gas exchange. Experiments show that the fusion of visual and inertial data reduces the estimation error by 8% and 18% compared to the use of the visual sensor only and the inertial sensor only, respectively, and by 33% compared to a MET-based approach. We conclude from our results that the proposed approach is suitable for home monitoring in a controlled environment.
College: Faculty of Science and Engineering
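
To make the fusion idea in the abstract concrete, below is a minimal, hypothetical sketch of feature-level fusion for calorie regression. The windowing, the per-window statistics, the synthetic signals and the random-forest regressor are all illustrative assumptions for this sketch; they are not the descriptors, fusion scheme or dataset used in the paper.

```python
# Minimal sketch: early (feature-level) fusion of visual and inertial
# streams for per-window calorific expenditure regression.
# All feature choices and data here are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def window_features(signal, fs, win_s=10):
    """Split a (T, D) signal into fixed-length windows and compute simple
    per-window statistics (mean and standard deviation per channel)."""
    win = int(fs * win_s)
    n = signal.shape[0] // win
    feats = []
    for i in range(n):
        seg = signal[i * win:(i + 1) * win]
        feats.append(np.concatenate([seg.mean(axis=0), seg.std(axis=0)]))
    return np.vstack(feats)

# Synthetic stand-ins: per-frame visual descriptors (e.g. pose features
# from an RGB-Depth camera) and tri-axial accelerometer readings.
rng = np.random.default_rng(0)
visual = rng.normal(size=(3000, 8))      # 30 Hz * 100 s, 8-D descriptor
inertial = rng.normal(size=(10000, 3))   # 100 Hz * 100 s, 3-axis accel

X_vis = window_features(visual, fs=30)
X_acc = window_features(inertial, fs=100)
X_fused = np.hstack([X_vis, X_acc])      # concatenate modality features

# In the real dataset the per-window targets would come from indirect
# calorimetry (gas exchange); random values are used here for illustration.
y = rng.uniform(2.0, 8.0, size=X_fused.shape[0])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_fused, y)
print("Predicted kcal/min for first window:", model.predict(X_fused[:1])[0])
```

Training separate regressors on each modality and comparing their errors against the fused model would reproduce, in miniature, the kind of single-modality versus fusion comparison reported in the abstract.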