
Conference Paper/Proceeding/Abstract

Recognition of unscripted kitchen activities and eating behaviour for health monitoring

S. Whitehouse, K. Yordanova, A. Paiement, M. Mirmehdi

Proceedings of the 2nd IET International Conference on Technologies for Active and Assisted Living, Pages: 1 - 6

Swansea University Author: Adeline Paiement

Full text not available from this repository; see the online access link below.

DOI (Published version): 10.1049/ic.2016.0050


Published in: Proceedings of the 2nd IET International Conference on Technologies for Active and Assisted Living
ISBN: 978-1-78561-393-7
Published: London, 2nd IET International Conference on Technologies for Active and Assisted Living (TechAAL 2016), 2016
Online Access: http://ieeexplore.ieee.org/document/7801334/
URI: https://cronfa.swan.ac.uk/Record/cronfa31411
Abstract: Nutrition-related health conditions such as diabetes and obesity can seriously impact quality of life for those who are affected by them. A system able to monitor kitchen activities and patients’ eating behaviours could provide clinicians with important information, helping them to improve patients’ treatments. We propose a symbolic model able to describe unscripted kitchen activities and eating habits of people in home settings. This model consists of an ontology, which describes the problem domain, and a Computational State Space Model (CSSM), which is able to reason in a probabilistic manner about a subject’s actions, goals, and causes of any problems during task execution. To validate our model we recorded 15 unscripted kitchen activities involving 9 subjects, with the video data being annotated according to the proposed ontology schemata. We then evaluated the model’s ability to recognise activities and potential goals from action sequences by simulating noisy observations from the annotations. The results showed that our model is able to recognise kitchen activities with an average accuracy of 80% when using specialised models, and with an average accuracy of 40% when using the general model.
College: Faculty of Science and Engineering
Start Page: 1
End Page: 6
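
A hypothetical sketch (not code from the paper) of the evaluation idea described in the abstract: noisy observations are simulated from annotated action sequences and recognition accuracy is measured against the ground-truth annotations. The action vocabulary, the noise_rate parameter, and all function names here are assumptions made for illustration only.

import random

# Hypothetical action vocabulary for a kitchen scenario (assumption, not from the paper).
ACTION_VOCAB = ["take_cup", "pour_water", "stir", "eat", "drink", "wash_up"]

def simulate_noisy_observations(annotated_actions, noise_rate=0.2, seed=0):
    """Corrupt a fraction of ground-truth action labels to mimic noisy sensing."""
    rng = random.Random(seed)
    noisy = []
    for action in annotated_actions:
        if rng.random() < noise_rate:
            noisy.append(rng.choice(ACTION_VOCAB))  # replaced by a random label
        else:
            noisy.append(action)                    # observation kept correct
    return noisy

def accuracy(predicted, ground_truth):
    """Fraction of time steps where the predicted label matches the annotation."""
    matches = sum(p == g for p, g in zip(predicted, ground_truth))
    return matches / len(ground_truth)

# Example usage with a short annotated sequence.
annotations = ["take_cup", "pour_water", "stir", "drink", "wash_up"]
observations = simulate_noisy_observations(annotations, noise_rate=0.2)
print(observations)
print("agreement with annotations: {:.2f}".format(accuracy(observations, annotations)))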