Through-Screen Visible Light Sensing Empowered by Embedded Deep Learning
Description
Motivated by the trend toward full-screen devices such as smartphones, in this work we propose through-screen sensing with visible light for the application of fingertip air-writing. The system recognizes handwritten digits using under-screen photodiodes as the receiver. The key idea is to recognize the weak screen light reflected by the finger as it writes digits above the screen. Because its light source, the screen, is fixed, the proposed air-writing system is immune to scene changes. However, the screen is a double-edged sword: it acts as both a signal source and a noise source. We propose a data preprocessing method to reduce the interference of the screen as a noise source, and we design an embedded deep learning model, a customized ConvRNN, to capture the spatial and temporal patterns in the dynamic, weak reflected signal for air-written digit recognition. The evaluation results show that our through-screen fingertip air-writing system with visible light achieves accuracy of up to 91%. Results further show that the size of the customized ConvRNN model can be reduced by 94% with less than a 10% drop in performance.
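To make the described pipeline concrete, the following is a minimal, hypothetical sketch of the two stages named above: screen-interference removal by baseline subtraction, followed by a ConvRNN-style forward pass (a 1-D convolution over the photodiode array per frame, a simple recurrent update over time, and a 10-way digit classifier). All shapes, layer sizes, weights, and the specific baseline-subtraction step are illustrative assumptions, not the authors' exact preprocessing or model design.

```python
import math
import random

random.seed(0)

N_PD = 4        # number of under-screen photodiodes (assumed)
T = 20          # frames per air-writing gesture (assumed)
HIDDEN = 8      # RNN hidden size (assumed)
N_DIGITS = 10   # digits 0-9

def remove_screen_interference(frames, baseline):
    """Subtract a per-photodiode baseline captured with no finger present,
    suppressing the screen's own (fixed) light as a noise source."""
    return [[s - b for s, b in zip(frame, baseline)] for frame in frames]

def conv1d(frame, kernel):
    """Valid 1-D convolution across the photodiode array for one frame."""
    k = len(kernel)
    return [sum(frame[i + j] * kernel[j] for j in range(k))
            for i in range(len(frame) - k + 1)]

def rnn_step(x, h, w_xh, w_hh):
    """One Elman-style recurrent update: h' = tanh(W_xh x + W_hh h)."""
    return [math.tanh(sum(w_xh[i][j] * x[j] for j in range(len(x))) +
                      sum(w_hh[i][j] * h[j] for j in range(len(h))))
            for i in range(len(h))]

def softmax(z):
    m = max(z)
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

def conv_rnn_forward(frames, kernel, w_xh, w_hh, w_out):
    """Conv features per frame, RNN over time, linear head + softmax."""
    h = [0.0] * HIDDEN
    for frame in frames:
        h = rnn_step(conv1d(frame, kernel), h, w_xh, w_hh)
    return softmax([sum(w_out[c][i] * h[i] for i in range(HIDDEN))
                    for c in range(N_DIGITS)])

# Random (untrained) weights just to exercise the forward pass.
kernel = [random.uniform(-1, 1) for _ in range(3)]
feat = N_PD - len(kernel) + 1
w_xh = [[random.uniform(-1, 1) for _ in range(feat)] for _ in range(HIDDEN)]
w_hh = [[random.uniform(-1, 1) for _ in range(HIDDEN)] for _ in range(HIDDEN)]
w_out = [[random.uniform(-1, 1) for _ in range(HIDDEN)] for _ in range(N_DIGITS)]

baseline = [0.5] * N_PD  # screen-only reading, no finger (assumed)
frames = [[0.5 + random.uniform(0.0, 0.05) for _ in range(N_PD)]
          for _ in range(T)]  # weak reflected signal on top of the baseline
probs = conv_rnn_forward(remove_screen_interference(frames, baseline),
                         kernel, w_xh, w_hh, w_out)
print(len(probs))  # one probability per digit class
```

The convolution captures the spatial pattern across photodiodes within a frame, while the recurrence accumulates the temporal pattern of the writing stroke, mirroring the spatial-temporal modeling role the abstract assigns to ConvRNN.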
Files
AIChallengeIoT2021_Through-Screen VLS.pdf (1.3 MB)
md5:9fc03485ef13a9329b8301fc86ffa802