---
tags:
- robotics
- grasping
- simulation
license: "cc-by-4.0"
---

# GraspGen: Scaling Simulated Grasping
GraspGen is a large-scale simulated grasp dataset spanning multiple robot embodiments and grippers.

<img src="assets/cover.png" width="1000" height="250" title="readme1"> 


We release over 57 million grasps, computed for a subset of 8,515 objects from the [Objaverse XL](https://objaverse.allenai.org/) (LVIS) dataset. Grasps are provided for three grippers: the Franka Panda, the Robotiq 2F-140 industrial gripper, and a suction gripper.

<img src="assets/montage2.png" width="1000" height="500" title="readme2"> 

## Dataset Format
The dataset is released in the [WebDataset](https://github.com/webdataset/webdataset) format. The folder structure of the dataset is as follows:
```
grasp_data/
	franka/shard_{0-7}.tar
	robotiq2f140/shard_{0-7}.tar
	suction/shard_{0-7}.tar
splits/
	franka/{train/valid}_scenes.json
	robotiq2f140/{train/valid}_scenes.json
	suction/{train/valid}_scenes.json
```
We release train/validation splits along with the grasp dataset.
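
For example, a split can be loaded with Python's built-in `json` module. This is a minimal sketch; it assumes each `*_scenes.json` file holds a JSON list of scene/object identifiers:
```python
# Minimal sketch: load the train/validation split for one gripper.
# Assumes each *_scenes.json file contains a JSON list of scene/object identifiers.
import json

with open("splits/franka/train_scenes.json") as f:
    train_scenes = json.load(f)
with open("splits/franka/valid_scenes.json") as f:
    valid_scenes = json.load(f)

print(f"{len(train_scenes)} train scenes, {len(valid_scenes)} valid scenes")
```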

Each JSON file in a shard contains the following data, which loads into a Python dictionary. Note that `num_grasps=2000` per object.
```
'object'/
	'scale' # Scale applied to the asset
'grasps'/
	'object_in_gripper' # Boolean mask indicating grasp success, [num_grasps X 1]
	'transforms' # Gripper poses as homogeneous matrices, [num_grasps X 4 X 4]
```
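
Below is a minimal sketch for iterating over one shard with the [webdataset](https://github.com/webdataset/webdataset) Python package. The shard path and the `"json"` sample key are assumptions (the key depends on the file extension used inside the tar); adjust them to your local copy of the dataset.
```python
# Sketch: iterate over one shard and read the grasp annotations.
# Assumes each sample in the tar is a .json file, exposed under the "json" key.
import json

import numpy as np
import webdataset as wds

dataset = wds.WebDataset("grasp_data/franka/shard_0.tar")

for sample in dataset:
    record = json.loads(sample["json"])  # hypothetical key; depends on the in-tar file extension
    scale = record["object"]["scale"]                             # scale applied to the asset
    transforms = np.asarray(record["grasps"]["transforms"])       # [2000, 4, 4] gripper poses
    success = np.asarray(record["grasps"]["object_in_gripper"])   # [2000] success mask
    print(sample["__key__"], scale, transforms.shape, success.mean())
    break
```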

## Visualizing the dataset

We provide standalone scripts for visualizing this dataset. See the header of [visualize_dataset.py](scripts/visualize_dataset.py) for installation instructions.

Before running any of the visualization scripts, remember to start meshcat-server in a separate terminal:
```shell
meshcat-server
```

To visualize a single object from the dataset, alongside its grasps:
```shell
cd scripts/ && python visualize_dataset.py \
    --dataset_path /path/to/dataset \
    --object_uuid {object_uuid} \
    --object_file /path/to/mesh \
    --gripper_name {choose from: franka, suction, robotiq2f140}
```

## Objaverse dataset
Please download the Objaverse XL (LVIS) objects separately. See the helper script [download_objaverse.py](scripts/download_objaverse.py) for instructions and usage.
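
As an illustration only (the helper script above is the supported route), the assets can also be fetched with the [objaverse](https://pypi.org/project/objaverse/) pip package; the calls below follow that package's documented API, but verify against the version you have installed:
```python
# Sketch: download a handful of Objaverse (LVIS) assets with the objaverse package.
import objaverse

lvis_annotations = objaverse.load_lvis_annotations()   # {lvis_category: [uid, ...]}
uids = [uid for uid_list in lvis_annotations.values() for uid in uid_list][:10]
paths = objaverse.load_objects(uids=uids)              # {uid: local path to the mesh}
print(paths)
```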

## License
Copyright © 2025, NVIDIA Corporation & affiliates. All rights reserved.


The dataset is released under a CC-BY 4.0 License.

The visualization code is released under the [NVIDIA source code license](LICENSE).

## Contact

Please reach out to [Adithya Murali](adithyamurali.com) (admurali@nvidia.com) and [Clemens Eppner](https://clemense.github.io/) (ceppner@nvidia.com) for further enquiries.