Dataset Open Access
Ilker Yildirim; Robert A. Jacobs
This is the second See & Grasp data set from Yildirim & Jacobs (2013). It contains both the visual and haptic features for "Fribbles", and covers a larger set of Fribbles than the original See & Grasp data set, which is available here.
Fribbles are complex, 3-D objects with multiple parts and spatial relations among the parts. Moreover, Fribbles have a categorical structure—that is, each Fribble is an exemplar from a category formed by perturbing a category prototype. There are 4 Fribble families, each with 4 species. Each Fribble has 4 slots, and one of 3 possible parts is attached to each slot, leading to 4 × 3⁴ = 324 total Fribbles in each family. The unmodified 3-D object files for the whole set of Fribbles can be found on Mike Tarr's (Department of Psychology, Carnegie Mellon University) web pages (TarrLab webpage).
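The count above can be checked by enumerating the combinations directly: 4 species per family, 4 slots, and 3 interchangeable parts per slot give 4 × 3⁴ = 324 Fribbles per family. A minimal sketch (the names it builds follow the dataset's naming scheme, e.g. "4_a1b3c2d3"):

```python
from itertools import product

species = [1, 2, 3, 4]        # 4 species per family
slots = ["a", "b", "c", "d"]  # 4 part slots per Fribble
variants = [1, 2, 3]          # 3 interchangeable parts per slot

# Enumerate every Fribble name in one family, e.g. "4_a1b3c2d3".
fribbles = [
    f"{s}_" + "".join(f"{slot}{v}" for slot, v in zip(slots, combo))
    for s in species
    for combo in product(variants, repeat=len(slots))
]

print(len(fribbles))  # 4 * 3**4 = 324
```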
The See & Grasp data set contains 891 items corresponding to 891 Fribbles (324 each for families A and C, and 243 for family B, because the 3-D model files for one of the 4 species in family B appear to be corrupt). There are 3 entries associated with each item. One entry is the 3-D object model for a Fribble. The second entry is an image of the Fribble rendered from a canonical viewpoint, so that the Fribble's parts and the spatial relations among the parts are clearly visible. (Using the 3-D object model, users can easily generate new images of a Fribble from any desired viewpoint.) The third entry is a representation of the Fribble's haptic features: a set of joint angles obtained from a grasp simulator known as "GraspIt!" (Miller & Allen, 2004). GraspIt! contains a simulator of a human hand. When forming the representation of a Fribble's haptic features, the input to GraspIt! was the 3-D object model for the Fribble. Its output was the set of 16 joint angles of the fingers of a simulated human hand obtained when the simulated hand "grasped" the Fribble. Grasps—or closings of the fingers around a Fribble—were performed using GraspIt!'s AutoGrasp function. Each Fribble was grasped 24 times, with the Fribble rotated 8 times (in 45° steps) around each axis. To ensure that Fribbles fit inside GraspIt!'s hand, their sizes were reduced by 29%. Please see the figure for sample images from the data set and an illustration of the grasping procedure.
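The grasping procedure described above (8 rotations of 45° around each of the 3 axes, giving 24 grasps per Fribble) can be enumerated as follows. This is only a sketch of the orientation schedule; the actual GraspIt! calls are omitted, since they depend on the user's GraspIt! setup:

```python
# Enumerate the 24 grasp orientations: 8 rotations in 45-degree steps
# around each of the x, y, and z axes (3 * 8 = 24 grasps).
orientations = [
    (axis, step * 45)
    for axis in ("x", "y", "z")
    for step in range(8)  # 0, 45, 90, ..., 315 degrees
]

print(len(orientations))  # 24
# For each orientation, one would rotate the Fribble, call GraspIt!'s
# AutoGrasp to close the simulated hand, and record the 16 resulting
# joint angles.
```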
The file contains three folders and a CSV file. The folders “Fa”, “Fb”, and “Fc” contain the files for each Fribble family we examined (A, B, and C). Each family folder contains the following folders:
The files are named after the parts that were combined to make each Fribble. For example, 4_a1b3c2d3 was made by combining 4_body.obj, 4_a1.obj, 4_b3.obj, 4_c2.obj, and 4_d3.obj.
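Under this naming scheme, the component .obj files can be recovered from a Fribble's name. The helper below is a sketch mirroring the example above, not part of the data set's own tooling:

```python
import re

def component_files(fribble_name):
    """Split a name like '4_a1b3c2d3' into its component .obj files."""
    species, parts = fribble_name.split("_")
    files = [f"{species}_body.obj"]
    # Each slot/part pair is one letter (a-d) followed by one digit.
    files += [f"{species}_{p}.obj" for p in re.findall(r"[a-d]\d", parts)]
    return files

print(component_files("4_a1b3c2d3"))
# ['4_body.obj', '4_a1.obj', '4_b3.obj', '4_c2.obj', '4_d3.obj']
```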
Finally, “haptics.csv” contains the haptic features for each Fribble. It is a CSV file with 892 rows (including one header row) and 255 columns. The first column gives the Fribble's species and name; the next 254 columns are the joint angles from GraspIt!: we performed a grasp for each of 24 orientations (rotating in 45° steps around each axis), and each grasp has 16 DOF values associated with it, representing the positions of the hand's joints.
The remaining columns contain the data for the contact points between the Fribble and the hand in GraspIt!. The first number for each grasp (there are 24 grasps per Fribble) is the number of contact points for that grasp. For each contact point, the first 6 numbers describe the contact's wrench, the next 3 numbers give the contact's location in body coordinates, and the last number gives the scalar constraint error for that contact.
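Because the number of contact points varies from grasp to grasp, each row is most safely read as a variable-length record. The parser below is a sketch under one assumed layout (name column, then all 24 × 16 joint angles, then per grasp a contact count followed by 10 numbers per contact: 6 wrench, 3 location, 1 constraint error); the exact column order should be verified against the file itself:

```python
def parse_haptics_row(row, n_grasps=24, n_dof=16):
    """Parse one haptics.csv row into (name, joint angles, contacts).

    Assumes: row[0] is the Fribble's species and name, followed by
    n_grasps * n_dof joint angles, then for each grasp a contact
    count and 10 numbers per contact.
    """
    name, values = row[0], [float(v) for v in row[1:]]
    # Joint angles: one list of n_dof values per grasp.
    angles = [values[g * n_dof:(g + 1) * n_dof] for g in range(n_grasps)]
    # Contact data: variable-length block per grasp.
    contacts, i = [], n_grasps * n_dof
    for _ in range(n_grasps):
        n_points = int(values[i]); i += 1
        grasp_contacts = []
        for _ in range(n_points):
            c = values[i:i + 10]; i += 10
            grasp_contacts.append(
                {"wrench": c[:6], "location": c[6:9], "error": c[9]}
            )
        contacts.append(grasp_contacts)
    return name, angles, contacts
```

A row read with Python's `csv` module (a list of strings) can be passed to this function directly.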
Sample code for rendering objects visually with VTK and haptically with GraspIt! can be found in the SampleCode folder. Please read the GraspItUpdate.pdf file to see how to update GraspIt!'s TCP server code for haptic rendering. The SampleCode folder also contains aomr.xml, the world file (i.e., the setup of the human hand) used for haptic rendering in GraspIt!.
Please cite the following paper in relation to the See & Grasp 2 data set:
Yildirim, I. & Jacobs, R. A. (2013). Transfer of object category knowledge across visual and haptic modalities: Experimental and computational studies. Cognition, 126, 135-148.
Citation for GraspIt!
Miller, A. T., & Allen, P. K. (2004). GraspIt! A versatile simulator for robotic grasping. IEEE Robotics & Automation Magazine, 11(4), 110-122.