Hand Gesture Recognition for User Interface Control via Camera Feed
Authors/Creators
Description
Abstract—Machine learning-based hand gesture recognition for user interface control via camera feed, designed for automobile technicians, assisting them in interacting with the system while working. Gesture-based interaction has emerged as a compelling paradigm for enhancing human-computer communication, offering a more intuitive alternative to traditional input methods. Leveraging Google MediaPipe Hands, a state-of-the-art machine learning solution, the model demonstrates real-time hand tracking and gesture recognition capabilities. The HaGRID dataset, comprising 552,992 high-resolution hand gesture images, serves as the foundation for training our gesture recognition model using the Random Forest algorithm. This versatile algorithm, renowned for its robustness and effectiveness, enables accurate and stable predictions for activating the corresponding UI control actions. Through a detailed exploration of these technologies, attendees will gain insight into the practical applications of gesture recognition, paving the way for immersive and intuitive human-computer interaction experiences in diverse domains.
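The pipeline the abstract describes — MediaPipe Hands landmarks fed to a Random Forest classifier that maps gestures to UI actions — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the landmark count and coordinate layout follow MediaPipe Hands' documented output (21 landmarks, each with x, y, z), but the training data here is synthetic stand-in data rather than HaGRID features, and the class labels and hyperparameters are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# MediaPipe Hands returns 21 landmarks per detected hand, each with
# (x, y, z) coordinates; flattening them gives a 63-dim feature vector.
N_LANDMARKS, N_COORDS = 21, 3
N_FEATURES = N_LANDMARKS * N_COORDS

rng = np.random.default_rng(0)
# Synthetic stand-in for HaGRID-derived landmark features:
# 200 samples spread over 4 illustrative gesture classes.
X_train = rng.random((200, N_FEATURES))
y_train = rng.integers(0, 4, size=200)

# Random Forest classifier, as named in the abstract; n_estimators
# and random_state here are illustrative choices, not the paper's.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# At runtime, each camera frame's landmarks would be flattened the
# same way and classified; the predicted class id would then be
# mapped to the corresponding UI control action.
frame_landmarks = rng.random((1, N_FEATURES))
gesture_id = int(clf.predict(frame_landmarks)[0])
```

In a live system, `frame_landmarks` would come from MediaPipe's per-frame hand-landmark output, and a small dispatch table would translate `gesture_id` into the UI action to trigger.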
Files
| Name | Size |
|---|---|
| Hand Gesture Recognition for User Interface Control via Camera Feed.pdf (md5:8a3e8d383269e5f633c4b3a1df53bbd6) | 366.7 kB |