Conference paper Open Access

Through-Screen Visible Light Sensing Empowered by Embedded Deep Learning

Liu, Hao; Ye, Hanting; Yang, Jie; Wang, Qing

Dublin Core Export

<?xml version='1.0' encoding='utf-8'?>
<oai_dc:dc xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.openarchives.org/OAI/2.0/oai_dc/ http://www.openarchives.org/OAI/2.0/oai_dc.xsd">
  <dc:creator>Liu, Hao</dc:creator>
  <dc:creator>Ye, Hanting</dc:creator>
  <dc:creator>Yang, Jie</dc:creator>
  <dc:creator>Wang, Qing</dc:creator>
  <dc:description>Motivated by the trend toward full screens on devices such as smartphones, in this work we propose through-screen sensing with visible light for the application of fingertip air-writing. The system recognizes handwritten digits using under-screen photodiodes as the receiver. The key idea is to recognize the weak light reflected by the finger as it writes digits above the screen. The proposed air-writing system is immune to scene changes because the screen provides a fixed light source. The screen, however, is a double-edged sword: it is both the signal source and a noise source. We propose a data preprocessing method that reduces the interference of the screen as a noise source, and we design an embedded deep learning model, a customized ConvRNN, that captures the spatial and temporal patterns in the dynamic, weak reflected signal for air-written digit recognition. The evaluation results show that our through-screen fingertip air-writing system with visible light achieves an accuracy of up to 91%. Results further show that the size of the customized ConvRNN model can be reduced by 94% with less than a 10% drop in performance.</dc:description>
  <dc:title>Through-Screen Visible Light Sensing Empowered by Embedded Deep Learning</dc:title>
</oai_dc:dc>
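The ConvRNN described in the abstract pairs convolutional feature extraction (spatial patterns across photodiodes) with a recurrent layer (temporal dynamics of the writing stroke). A minimal forward-pass sketch of that idea is below; all shapes, parameter names, and the choice of a vanilla tanh RNN cell are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Valid 1-D convolution over time with ReLU.
    x: (T, C_in), w: (K, C_in, F), b: (F,) -> (T-K+1, F)."""
    K = w.shape[0]
    T_out = x.shape[0] - K + 1
    out = np.stack([np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1])) + b
                    for t in range(T_out)])
    return np.maximum(out, 0.0)

def rnn(xs, Wxh, Whh, bh):
    """Vanilla RNN over the feature sequence; returns the final hidden state."""
    h = np.zeros(Whh.shape[0])
    for x in xs:
        h = np.tanh(x @ Wxh + h @ Whh + bh)
    return h

def convrnn_forward(signal, p):
    feats = conv1d(signal, p["w"], p["b"])          # spatial patterns
    h = rnn(feats, p["Wxh"], p["Whh"], p["bh"])     # temporal patterns
    logits = h @ p["Wo"] + p["bo"]                  # 10 digit classes
    e = np.exp(logits - logits.max())
    return e / e.sum()                              # softmax probabilities

# Hypothetical setup: 100 time samples from 4 under-screen photodiodes.
T, C, K, F, H = 100, 4, 5, 8, 16
params = {
    "w": rng.normal(0, 0.1, (K, C, F)), "b": np.zeros(F),
    "Wxh": rng.normal(0, 0.1, (F, H)), "Whh": rng.normal(0, 0.1, (H, H)),
    "bh": np.zeros(H),
    "Wo": rng.normal(0, 0.1, (H, 10)), "bo": np.zeros(10),
}
probs = convrnn_forward(rng.normal(size=(T, C)), params)
```

The 94% size reduction reported in the abstract would then amount to shrinking parameter counts such as `F` and `H` (or quantizing the weights) for embedded deployment.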
