An Intelligent Eye Gaze Virtual Keyboard with Real Time Blink Interaction
Description
This paper presents an AI-assisted virtual keyboard controlled entirely through eye gaze and blink detection, offering a fully hands-free typing interface for users with motor impairments. The system integrates MediaPipe for eye landmark tracking, OpenCV for real-time gaze estimation, and a Tkinter-based graphical interface for virtual keyboard interaction. Users can type via sustained gaze (dwell selection) or blink-triggered actions, with an adaptive debounce mechanism to prevent accidental inputs. To enhance typing fluency, a dual-layer language model is implemented: a lightweight bigram model derived from the Brown corpus for offline prediction, and an optional transformer-based GPT-2 engine for contextual and semantically rich suggestions. Experimental evaluation with two participants demonstrated high blink classification accuracy (94.6%), stable gaze-to-cursor mapping (91.3% accuracy), and improved typing speed (up to 9.4 WPM) during AI-assisted sessions. Visual plots of cursor smoothing, blink timelines, and accuracy breakdowns validated system responsiveness and usability. The interface was rated intuitive and fatigue-free, with participants preferring blink overrides and GPT suggestions for longer phrases. The proposed system offers a cost-effective, real-time communication tool that bridges computer vision and AI for assistive technology, operating entirely on standard webcams without proprietary hardware.
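The paper's code is not reproduced here, but the blink-classification and debounce behaviour it describes can be illustrated with a short sketch. This version assumes an eye-aspect-ratio (EAR) heuristic computed over six eye landmarks (such as those MediaPipe provides); the EAR formula is the standard Soukupová–Čech one, while the threshold and timing constants below are illustrative values, not the authors' parameters.

```python
def eye_aspect_ratio(eye):
    """EAR from six (x, y) eye landmarks ordered corner, upper pair,
    opposite corner, lower pair. The ratio drops sharply when the
    eyelid closes, which is the usual cue for blink detection."""
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    p1, p2, p3, p4, p5, p6 = eye
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))


class BlinkDebouncer:
    """Registers a deliberate blink only after the EAR stays below a
    threshold for `min_closed_s` seconds, and enforces a refractory
    period so one long closure cannot trigger two selections."""

    def __init__(self, threshold=0.21, min_closed_s=0.15, refractory_s=0.4):
        self.threshold = threshold
        self.min_closed_s = min_closed_s
        self.refractory_s = refractory_s
        self._closed_since = None   # timestamp when the eye first closed
        self._last_fire = -1e9      # timestamp of the last accepted blink

    def update(self, ear, now):
        """Feed one frame's EAR and timestamp; True means 'accept blink'."""
        if ear < self.threshold:
            if self._closed_since is None:
                self._closed_since = now
            closed_long_enough = now - self._closed_since >= self.min_closed_s
            outside_refractory = now - self._last_fire >= self.refractory_s
            if closed_long_enough and outside_refractory:
                self._last_fire = now
                return True
        else:
            self._closed_since = None  # eye reopened; reset the timer
        return False
```

In a real pipeline the per-frame EAR would come from MediaPipe Face Mesh landmarks, and `update` would be called once per captured frame with the frame's timestamp; an adaptive variant could tune `threshold` per user from a short calibration of open-eye EAR values.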
Files

| Name | Size |
|---|---|
| An Intelligent Eye Gaze Virtual Keyboard with Real Time Blink Interaction.pdf (md5:dbd9b4a2b6f7b2a147efc69eb71dec8f) | 468.5 kB |