📖 For more visual results, please check out our project page.
- Our paper has been accepted to NeurIPS 2025 (Spotlight) 🎉
- We provide a script for converting our JSON annotations into URDF format 🎉 See `urdf_gen.py`.
For more details about our proposed dataset, including its structure and annotations, please see this link.
The scripts for annotation and for obtaining texture information are located in `./tools`.
Run this script to convert our JSON files to URDF:

```bash
python urdf_gen.py
```

- Clone the repo:
```bash
git clone --recurse-submodules https://github.com/ziangcao0312/PhysX-3D.git
cd PhysX-3D
```
- Create a new conda environment named `physxgen` and install the dependencies:

```bash
. ./setup.sh --new-env --basic --xformers --flash-attn --diffoctreerast --spconv --mipgaussian --kaolin --nvdiffrast
```

Note: The detailed usage of `setup.sh` can be found at TRELLIS.
- Download and preprocess the PhysXNet dataset:

```bash
huggingface-cli download Caoza/PhysX-3D PhysXNet.zip --repo-type dataset --local-dir ./dataset_toolkits/
cd dataset_toolkits
unzip PhysXNet.zip -d physxnet
```

Note: Since PartNet has no texture information, you need to download the ShapeNet dataset and save it to `./dataset_toolkits/shapenet` to obtain texture information. The validation (first 1k samples) and test (last 1k samples) splits are stored in `val_test_list.npy`.
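For illustration, the split convention could be consumed as in the sketch below. This is not the repository's code; a plain list stands in for the ID array you would load from `val_test_list.npy`, and the ID format is hypothetical.

```python
# Sketch of the val/test split convention: first 1k samples are
# validation, last 1k are test. A plain list stands in for the
# real array from val_test_list.npy.
ids = [f"sample_{i:05d}" for i in range(2500)]
val_ids = ids[:1000]    # validation: first 1k samples
test_ids = ids[-1000:]  # test: last 1k samples
print(len(val_ids), len(test_ids))  # → 1000 1000
```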
```bash
bash precess.sh
```

- VAE training:

```bash
python train.py \
    --config configs/vae/slat_vae_enc_dec_mesh_phy.json \
    --output_dir outputs/vae_phy \
    --data_dir ./datasets/PhysXNet
```

- Diffusion training:
```bash
python train.py \
    --config configs/generation/slat_flow_img_dit_L_phy.json \
    --output_dir outputs/diffusion_phy \
    --data_dir ./datasets/PhysXNet
```

- Download the pre-trained model from Hugging Face:
```bash
bash download_pretrain.sh
```

- Run the inference code:

```bash
python example.py
```

- Run the example script:

```bash
python example_render_gt_foreval.py
```

- Calculate the metrics:
- Euclidean distance for absolute scale
- PSNR for density, affordance, and description maps
- Instantiation distance for kinematics
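As a rough illustration of the first two metrics, here is a minimal sketch. These helpers are hypothetical and are not the repository's evaluation scripts; the map inputs are assumed to be flattened 8-bit values.

```python
import math

def scale_distance(pred_scale, gt_scale):
    # Euclidean distance between predicted and ground-truth 3D scales
    # (hypothetical helper for the absolute-scale metric).
    return math.dist(pred_scale, gt_scale)

def psnr(pred, gt, max_val=255.0):
    # PSNR between two flattened maps (e.g. density or affordance),
    # assuming 8-bit values; identical maps give infinite PSNR.
    mse = sum((p - g) ** 2 for p, g in zip(pred, gt)) / len(pred)
    return float("inf") if mse == 0 else 10.0 * math.log10(max_val ** 2 / mse)

print(scale_distance((1.0, 2.0, 2.0), (1.0, 2.0, 4.0)))  # → 2.0
print(round(psnr([100] * 16, [110] * 16), 2))            # → 28.13
```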
Note: PhysX-Anything introduces a VLM-based evaluation for kinematics to achieve a more flexible and fair evaluation.
If you find PhysX useful for your work, please cite:
```bibtex
@article{cao2025physx,
  title={PhysX-3D: Physical-Grounded 3D Asset Generation},
  author={Cao, Ziang and Chen, Zhaoxi and Pan, Liang and Liu, Ziwei},
  journal={arXiv preprint arXiv:2507.12465},
  year={2025}
}

@article{physxanything,
  title={PhysX-Anything: Simulation-Ready Physical 3D Assets from Single Image},
  author={Cao, Ziang and Hong, Fangzhou and Chen, Zhaoxi and Pan, Liang and Liu, Ziwei},
  journal={arXiv preprint arXiv:2511.13648},
  year={2025}
}
```
The data and code are based on PartNet and TRELLIS. We would like to express our sincere thanks to the contributors.
Distributed under the S-Lab License. See LICENSE for more information.
