Published August 10, 2020 | Version v1
Dataset Restricted

ManiGaze

  • Idiap Research Institute

Description

Current systems for gaze estimation are usually trained and evaluated on datasets with relatively ideal, near-frontal head poses and visual targets located in front of the user. While this makes such datasets useful for model design and method comparison, the performance of these systems under more realistic sensing conditions and setups remains largely unknown. Datasets covering such situations are therefore needed to evaluate the robustness of existing methods, measure their performance, understand their limitations, and trigger new research that pushes the state of the art in gaze tracking further.

The ManiGaze dataset was designed with these goals in mind. More specifically, it was created to evaluate gaze estimation from remote RGB and RGB-D (standard vision and depth) sensors in Human-Robot Interaction (HRI) settings, in particular during object manipulation tasks. The recording methodology was designed to let users behave freely and encourage a natural interaction with the robot, and to collect gaze targets automatically, since a-posteriori annotation of gaze is nearly impossible. The dataset involves 17 participants who performed four different tasks in four sessions:

  • Marker on the table Targets (MT) session. The robot asks the user to look or point at markers located on a table placed between the user and the robot.

  • End-effector Targets (ET) session. The robot asks the user to look at its end-effector as it moves it in the space between them.

  • Object Manipulation (OM) session. The robot asks the user to perform a sequence of pick-and-place actions using different objects.

  • Set the Table (ST) session. The user is asked to show and explain to the robot how to set a table, with a plate, knife, fork, spoon, and glass.

The gaze ground truth was recorded automatically for the first two sessions, providing a convenient benchmark for evaluating gaze estimation methods. The last two sessions provide additional material for further research (e.g., eye-hand coordination, movement analysis).
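As an illustration of how the MT and ET ground truth can serve as a benchmark, the Python sketch below scores a gaze estimate by the 3D angular error between the predicted gaze direction and the direction from the eye to the ground-truth target. All names and the example numbers are ours and purely illustrative; refer to the dataset documentation for the actual file formats and coordinate conventions.

    import numpy as np

    def angular_error_deg(pred_dir, eye_pos, target_pos):
        # Ground-truth gaze direction: unit vector from the eye position
        # to the known 3D target position.
        gt_dir = target_pos - eye_pos
        gt_dir = gt_dir / np.linalg.norm(gt_dir)
        # Normalize the predicted direction as well, then take the angle
        # between the two unit vectors.
        pred_dir = pred_dir / np.linalg.norm(pred_dir)
        cos_angle = np.clip(np.dot(pred_dir, gt_dir), -1.0, 1.0)
        return np.degrees(np.arccos(cos_angle))

    # Toy example: eye at the origin, target 1 m ahead and 10 cm to the
    # left, prediction pointing straight ahead.
    eye = np.array([0.0, 0.0, 0.0])
    target = np.array([-0.1, 0.0, 1.0])
    pred = np.array([0.0, 0.0, 1.0])
    print(f"angular error: {angular_error_deg(pred, eye, target):.2f} deg")
    # -> about 5.71 degrees, i.e. arctan(0.1 / 1.0)

Averaging this error over all frames of the MT or ET session yields a single per-session score that can be compared across estimators.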

 

Reference

If you use this dataset, please cite the following publication:

R. Siegfried, B. Aminian and J.-M. Odobez.
ManiGaze: a Dataset for Evaluating Remote Gaze Estimator in Object Manipulation Situations.
In ACM Symposium on Eye Tracking Research and Applications (ETRA), June 2020.

Files

Restricted

The record is publicly accessible, but files are restricted to users with access.

Request access

If you would like to request access to these files, please fill out the form below.

You need to satisfy these conditions in order for this request to be accepted:

Access to the dataset is based on an End-User License Agreement. The use of the dataset is strictly restricted to non-commercial research.

Please provide us with the following information about the authorized signatory (who MUST hold a permanent position):

  • Full name
  • Name of organization
  • Position / job title
  • Academic / professional email address
  • URL where we can verify the information details

Only academic/professional email addresses from the same organization as the signatory are accepted for the online request. All online requests coming from generic email providers such as Gmail will be rejected.


Additional details

Related works

Is documented by
Conference paper: 10.1145/3379156.3391369 (DOI)