Dataset (Open Access)

Visuo-motor dataset recorded from a micro-farming robot

Guido Schillaci; Antonio Pico Villalpando

This is the dataset accompanying the paper [1], which describes algorithms for intrinsic motivation and episodic memory on the Sony LettuceThink micro-farming robot.

The LettuceThink micro-farming robot, developed by Sony Computer Science Laboratories, consists of an aluminium frame with an X-Carve CNC machine mounted on it. The CNC machine provides 3-axis movement to a depth camera (Sony DepthSense) mounted at the tip of the vertical z-axis (the end-effector camera). In the experiments presented in the paper, the end-effector camera faces top-down and only two motors (x and y) are used.

A simulator of the LettuceThink robot has been developed to ease testing different configurations of the learning system. The simulator generates sensorimotor data for requested trajectories of the end-effector camera. Given the initial position of the CNC machine and a target position, the simulator linearly interpolates the trajectory and returns the intermediate camera positions together with the image captured at each position. The sensorimotor data returned by the simulator were prerecorded by performing a full scan of the (x, y) plane of the CNC machine at a resolution of 5 mm. This resulted in 24,964 images, each mapped to an (x, y) position of the CNC machine. The dataset published here contains these images.
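The interpolation step described above can be sketched as follows. This is a minimal illustration, not the simulator's actual implementation: it linearly interpolates between a start and a target position and snaps each intermediate point to the 5 mm recording grid, so that every point can be mapped to one of the prerecorded images.

```python
import numpy as np

# Grid resolution of the prerecorded scan, from the dataset description.
STEP_MM = 5

def interpolate_trajectory(start, target, step_mm=STEP_MM):
    """Linearly interpolate intermediate (x, y) CNC positions between
    `start` and `target`, snapped to the recording grid so that each
    point corresponds to a prerecorded image."""
    start = np.asarray(start, dtype=float)
    target = np.asarray(target, dtype=float)
    # One step roughly every `step_mm` millimetres along the trajectory.
    n_steps = int(np.ceil(np.linalg.norm(target - start) / step_mm))
    points = np.linspace(start, target, max(n_steps, 2))
    # Snap each intermediate point to the nearest grid position.
    return (np.round(points / step_mm) * step_mm).astype(int)
```

Each returned position would then be looked up in the dataset to retrieve the corresponding image.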

In particular, the dataset consists of a set of images, each named after the specific positions of the robot's two motors. A Python script for generating visuo-motor trajectories (sequences of [image, motor_x, motor_y] tuples) from this dataset is available on GitHub: https://github.com/guidoschillaci/sonylettucethink_dataset
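Since each image is named after the corresponding motor positions, the (motor_x, motor_y) pair can be recovered directly from the filename. The sketch below assumes a hypothetical naming scheme such as `x0100_y0250.png`; the actual pattern in the published dataset may differ, in which case only the regular expression needs adjusting.

```python
import re
from pathlib import Path

# Hypothetical filename pattern: motor positions encoded as
# "x<steps>_y<steps>", e.g. "x0100_y0250.png". Adapt the regex to the
# actual naming scheme used in the dataset.
PATTERN = re.compile(r"x(\d+)_y(\d+)")

def load_position(path):
    """Extract the (motor_x, motor_y) position encoded in an image filename."""
    match = PATTERN.search(Path(path).stem)
    if match is None:
        raise ValueError(f"no motor position found in {path!r}")
    return int(match.group(1)), int(match.group(2))
```

Pairing each parsed position with its image yields the [image, motor_x, motor_y] samples that make up a visuo-motor trajectory.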


This work has been supported by the EU-H2020 ROMI Project and by the EU-H2020 Marie Skłodowska-Curie project "Predictive Robots" (grant agreement no. 838861).

References:

[1] Schillaci, G., Villalpando, A. P., Hafner, V. V., Hanappe, P., Colliaux, D., & Wintz, T. (2020). Intrinsic Motivation and Episodic Memories for Robot Exploration of High-Dimensional Sensory Spaces. arXiv preprint arXiv:2001.01982.

Files (1.8 GB)

Name: rgb_rectified.zip
Size: 1.8 GB
MD5: ff73f6c6d0e1beef9dd9b11a364b0b3c
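After downloading the archive, its integrity can be checked against the MD5 listed above. A minimal sketch using Python's standard `hashlib`, streaming the file in chunks to avoid loading 1.8 GB into memory:

```python
import hashlib

# Checksum from the file listing above.
EXPECTED_MD5 = "ff73f6c6d0e1beef9dd9b11a364b0b3c"

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 checksum of a file, reading it in 1 MiB chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage: md5sum("rgb_rectified.zip") == EXPECTED_MD5
```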

