---
license: cc-by-sa-4.0
size_categories:
- 100K<n<1M
---

*(Figure 1: Shape Complexity)*

*Figure 1. Examples of procedurally generated 3D shapes showcasing varying geometric complexity. This dataset only provides data in category (d); please check out our GitHub repository if you want to render data at other complexity levels.*

## Key Features

- **Size:** 150,000 procedurally generated 3D shapes.
- **Representation:** Each shape is sampled with 8,192 surface points.
- **Primitives:** Shapes are composed of randomly sampled primitives, including:
  - Cubes
  - Spheres
  - Cylinders
- **Augmentations:**
  - Boolean operations (e.g., difference, union)
  - Wireframe conversion

A rough, illustrative sketch of this generation recipe is shown below.

## Dataset Size and Performance

We evaluated the impact of dataset size on the **PB-T50-RS benchmark** for shape classification using Point-MAE-Zero. Our findings show that performance improves with larger dataset sizes but exhibits diminishing returns beyond a certain threshold.
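To make the generation recipe listed under **Key Features** concrete, the following is a rough, unofficial sketch of how such shapes could be produced with `trimesh`; this is an assumption for illustration, not the actual Zeroverse/Point-MAE-Zero code (see the linked GitHub repositories for the real pipeline). Boolean operations in `trimesh` require an installed backend such as `manifold3d`, and the wireframe-conversion augmentation is omitted here.

```python
import numpy as np
import trimesh  # boolean ops need a backend such as manifold3d installed


def random_primitive(rng):
    """Sample one primitive (cube, sphere, or cylinder) with a random size and offset."""
    kind = rng.choice(["cube", "sphere", "cylinder"])
    if kind == "cube":
        mesh = trimesh.creation.box(extents=rng.uniform(0.3, 1.0, size=3))
    elif kind == "sphere":
        mesh = trimesh.creation.icosphere(radius=rng.uniform(0.2, 0.5))
    else:
        mesh = trimesh.creation.cylinder(
            radius=rng.uniform(0.1, 0.4), height=rng.uniform(0.3, 1.0)
        )
    mesh.apply_translation(rng.uniform(-0.5, 0.5, size=3))
    return mesh


def random_shape(rng, n_primitives=3, n_points=8192):
    """Combine primitives with random boolean operations and sample surface points."""
    shape = random_primitive(rng)
    for _ in range(n_primitives - 1):
        other = random_primitive(rng)
        if rng.random() < 0.5:
            shape = shape.union(other)
        else:
            shape = shape.difference(other)
    # Sample 8,192 points on the resulting surface, analogous to one object.npy entry.
    points, _ = trimesh.sample.sample_surface(shape, n_points)
    return np.asarray(points)  # shape: (8192, 3)


points = random_shape(np.random.default_rng(0))
```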

*(Figure 2: Impact of Dataset Size)*

*Figure 2. The effect of dataset size on downstream shape classification performance. Note that our performance is on par with Point-MAE trained on ShapeNet at exactly the same scale.*

Additional experiments are available in [our paper](https://arxiv.org/abs/2411.17467).

## Dataset Format

The dataset is provided in a format ready for point cloud-based learning:

- **Surface points:** Stored as `.npy` files.
- Under **data/result** there are **152,508** sub-directories, each containing **object.npy** and **object_aug.npy**; `object_aug.npy` holds the surface points after augmentations.

For an example dataloader, please check out our [GitHub repository](https://github.com/UVA-Computer-Vision-Lab/point-mae-zero?tab=readme-ov-file); a minimal loading sketch is also included at the end of this card.

## License

This dataset is licensed under the [CC BY-SA 4.0 License](https://creativecommons.org/licenses/by-sa/4.0/). You are free to share and adapt the dataset, provided appropriate credit is given and any derivative works are distributed under the same license. Please also check the license of the [zeroverse](https://github.com/desaixie/zeroverse) repository.

## Citation

If you find this dataset useful in your research, please cite our work:

```
@article{chen2024learning3drepresentationsprocedural,
  title={Learning 3D Representations from Procedural 3D Programs},
  author={Xuweiyi Chen and Zezhou Cheng},
  year={2024},
  eprint={2411.17467},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2411.17467},
}
```
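## Loading Example

As a convenience, here is a minimal, unofficial sketch of a PyTorch dataset that reads the `data/result` layout described under **Dataset Format** (one sub-directory per shape). The class name and arguments are illustrative; the actual dataloader used in our experiments is in the Point-MAE-Zero GitHub repository linked above.

```python
from pathlib import Path

import numpy as np
from torch.utils.data import Dataset


class ProceduralShapes(Dataset):
    """Minimal reader for data/result/<sub-directory>/object[_aug].npy (illustrative)."""

    def __init__(self, root="data/result", use_augmented=False):
        self.use_augmented = use_augmented
        # One sub-directory per shape.
        self.dirs = sorted(p for p in Path(root).iterdir() if p.is_dir())

    def __len__(self):
        return len(self.dirs)

    def __getitem__(self, idx):
        name = "object_aug.npy" if self.use_augmented else "object.npy"
        points = np.load(self.dirs[idx] / name)  # expected shape: (8192, 3)
        return points.astype(np.float32)
```

Wrapping this in a `torch.utils.data.DataLoader` then yields batched point clouds for pretraining or evaluation.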