Cortacero, Kevin
Fischer, Tobias
Demiris, Yiannis
2019-10-27
<p>The RT-BENE dataset is licensed under <a href="https://creativecommons.org/licenses/by-nc-sa/4.0/">CC BY-NC-SA 4.0</a>. Commercial usage is not permitted. If you use our blink estimation code or dataset, please cite the relevant <a href="http://openaccess.thecvf.com/content_ICCVW_2019/html/GAZE/Cortacero_RT-BENE_A_Dataset_and_Baselines_for_Real-Time_Blink_Estimation_in_ICCVW_2019_paper.html">paper</a>:</p>
<blockquote>
<p>@inproceedings{CortaceroICCV2019W,<br>
author={Kevin Cortacero and Tobias Fischer and Yiannis Demiris},<br>
booktitle = {Proceedings of the IEEE International Conference on Computer Vision Workshops},<br>
title = {RT-BENE: A Dataset and Baselines for Real-Time Blink Estimation in Natural Environments},<br>
year = {2019},<br>
}</p>
</blockquote>
<p>More information can be found on the Personal Robotics Lab's website: <a href="https://www.imperial.ac.uk/personal-robotics/software/">https://www.imperial.ac.uk/personal-robotics/software/</a>.</p>
<p><strong>Overview</strong></p>
<p>We manually annotated images that are contained in the "noglasses" part of the <a href="https://zenodo.org/record/2529036">RT-GENE dataset</a> with blink annotations. This dataset contains the extracted eye image patches and associated annotations.</p>
<p>In particular, <em>rt_bene_subjects.csv</em> is an overview CSV file with the following columns:</p>
<ol>
<li>id</li>
<li>subject csv file</li>
<li>path to left eye images</li>
<li>path to right eye images</li>
<li>training/validation/discarded category</li>
<li>fold-id for the 3-fold evaluation.</li>
</ol>
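<p>Given the six columns described above, the overview file can be read with Python's standard <code>csv</code> module. The sketch below parses a small synthetic stand-in in that format; the real file's exact values and paths will differ, and the field names used here are illustrative, not taken from the dataset:</p>

```python
import csv
import io

# Synthetic stand-in for rt_bene_subjects.csv, following the six
# described columns (id, per-subject labels CSV, left/right eye image
# paths, category, fold id). Real rows in the dataset will differ.
sample = """s000,s000_blink_labels.csv,s000/left,s000/right,training,0
s001,s001_blink_labels.csv,s001/left,s001/right,validation,1
s002,s002_blink_labels.csv,s002/left,s002/right,discarded,-1
"""

# Illustrative field names, one per documented column.
FIELDS = ["id", "labels_csv", "left_dir", "right_dir", "category", "fold"]

def load_subjects(fh):
    """Parse the overview CSV into one dict per subject."""
    return [dict(zip(FIELDS, row)) for row in csv.reader(fh)]

subjects = load_subjects(io.StringIO(sample))
# Keep only subjects marked for training, e.g. to assemble one fold.
training = [s for s in subjects if s["category"] == "training"]
print([s["id"] for s in training])  # -> ['s000']
```

<p>For the real dataset, open <em>rt_bene_subjects.csv</em> with <code>open(...)</code> in place of the <code>io.StringIO</code> stand-in.</p>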
<p>Each individual "blink_labels" CSV file (<em>s000_blink_labels.csv</em> to <em>s016_blink_labels.csv</em>) contains two columns:</p>
<ol>
<li>image file name</li>
<li>label, where 0.0 is the annotation for open eyes, 1.0 for blinks, and 0.5 for annotator disagreement (these images are discarded)</li>
</ol>
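<p>A per-subject labels file can be loaded the same way, dropping the 0.5 disagreement rows as described above. The snippet below uses a synthetic stand-in; the image file names are made up for illustration and do not come from the dataset:</p>

```python
import csv
import io

# Synthetic stand-in for an sXXX_blink_labels.csv file: image file name
# plus a label (0.0 open eyes, 1.0 blink, 0.5 annotator disagreement).
# The file names here are invented examples.
sample = """left_000001.png,0.0
left_000002.png,1.0
left_000003.png,0.5
"""

def load_blink_labels(fh):
    """Return (filename, label) pairs, discarding 0.5 disagreement rows."""
    rows = [(name, float(label)) for name, label in csv.reader(fh)]
    return [(name, label) for name, label in rows if label != 0.5]

labels = load_blink_labels(io.StringIO(sample))
print(labels)  # -> [('left_000001.png', 0.0), ('left_000002.png', 1.0)]
```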
<p><strong>Associated code</strong></p>
<p>Please see the <a href="https://github.com/Tobias-Fischer/rt_gene">code repository</a> for code to train and evaluate a deep neural network on the RT-BENE dataset. The code repository also links to pre-trained models and code for real-time inference.</p>
<p>We thank the Personal Robotics Lab members at Imperial College for their support during this research. This work was supported by the European Union H2020 Framework Programme (Project PAL, H2020-PHC-643783), and a Royal Academy of Engineering Chair in Emerging Technologies.</p>
https://doi.org/10.1109/ICCV.2019
https://hdl.handle.net/10044/1/74529
ICCV2019W, IEEE International Conference on Computer Vision Workshops, Seoul, Korea, 27 October 2019
blink estimation
gaze estimation
computer vision
robotics
ICCV2019