
Journal article

Smart Home Control Using Real-Time Hand Gesture Recognition and Artificial Intelligence on Raspberry Pi 5

Thomas Hobbs, Anwar Ali

Electronics, Volume: 14, Issue: 20, Start page: 3976

Swansea University Authors: Thomas Hobbs, Anwar Ali

  • 70718.pdf (PDF | Version of Record, 7.81 MB)

    © 2025 by the authors. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.


Published in: Electronics
ISSN: 2079-9292
Published: MDPI AG 2025

URI: https://cronfa.swan.ac.uk/Record/cronfa70718
Abstract: This paper outlines the development of a low-cost system for home appliance control via real-time hand gesture classification using Computer Vision and a custom lightweight machine learning model. The system aims to enable those with speech or hearing disabilities to interface with smart home devices in real time using hand gestures, much as is currently possible with voice-activated ‘smart assistants’. It runs on a Raspberry Pi 5 to enable future IoT integration and reduce costs, and uses the official Camera Module v2 and 7-inch touchscreen. Frame preprocessing uses MediaPipe to extract hand landmark coordinates and NumPy to normalise them; a machine learning model then predicts the gesture. The model, a feed-forward network of five fully connected layers, was built with Keras 3 and deployed with TensorFlow Lite. Training data came from the HaGRIDv2 dataset, reduced from its original 23 one- and two-handed gestures to 15 one-handed gestures. Training on this data yielded a validation accuracy of 0.90 and a validation loss of 0.31. The system can control both analogue and digital hardware via GPIO pins and, while recognising a gesture, averages 20.4 frames per second with no observable delay.
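The landmark-normalisation step described in the abstract can be sketched in plain NumPy. The paper does not specify its exact scheme, so the function below is a hypothetical illustration of a common approach for MediaPipe hand landmarks: translate coordinates so the wrist (landmark 0) sits at the origin, then scale so the largest absolute coordinate is 1, yielding a position- and scale-invariant feature vector for the classifier.

```python
import numpy as np

def normalise_landmarks(landmarks):
    """Hypothetical landmark normalisation (the paper's exact scheme is not
    given): wrist-relative translation followed by max-absolute scaling."""
    pts = np.asarray(landmarks, dtype=np.float32)  # (N, 2) x/y pairs, e.g. 21 MediaPipe hand landmarks
    pts = pts - pts[0]                 # translate: wrist (landmark 0) to the origin
    scale = np.abs(pts).max()
    if scale > 0:
        pts = pts / scale              # scale: all coordinates into [-1, 1]
    return pts.flatten()               # flat feature vector for the feed-forward model
```

The flattened vector (42 values for 21 two-dimensional landmarks) would match the input shape expected by a small fully connected network such as the five-layer model the paper describes.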
Keywords: machine learning; Computer Vision; gesture recognition; accessibility; smart home control; landmark normalisation; TensorFlow Lite; OpenCV
College: Faculty of Science and Engineering
Funders: This research received no external funding.