
PhysX-3D: Physical-Grounded 3D Asset Generation

arXiv Project Page

¹S-Lab, Nanyang Technological University  ²Shanghai AI Laboratory
PhysX provides a new end-to-end paradigm for physical-grounded 3D asset generation.

📖 For more visual results, please check out our project page

🏆 News

  • Our paper has been accepted to NeurIPS 2025 (Spotlight) 🎉
  • We provide a script for converting our JSON annotations into URDF format 🎉 See urdf_gen.py.

PhysXNet & PhysXNet-XL

For more details about our proposed dataset, including its structure and annotations, please see this link

The scripts for annotation and obtaining texture information are located in ./tools.

Run the following script to convert our JSON annotations to URDF:

python urdf_gen.py
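At its core, the conversion walks the per-object physical annotation and emits URDF XML. The sketch below is only illustrative: the field names (`name`, `parts`, `mass`) are assumptions, not the actual PhysXNet schema; see urdf_gen.py for the real mapping.

```python
import xml.etree.ElementTree as ET

def json_to_urdf(annotation: dict) -> str:
    """Turn a (hypothetical) per-object physical annotation into a minimal URDF string."""
    robot = ET.Element("robot", name=annotation["name"])
    for part in annotation["parts"]:
        # Each annotated part becomes a URDF link carrying its physical properties.
        link = ET.SubElement(robot, "link", name=part["name"])
        inertial = ET.SubElement(link, "inertial")
        ET.SubElement(inertial, "mass", value=str(part["mass"]))
    return ET.tostring(robot, encoding="unicode")

# Toy annotation in the assumed layout:
example = {"name": "chair", "parts": [{"name": "seat", "mass": 1.2}]}
urdf = json_to_urdf(example)
print(urdf)
```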

PhysXGen

Installation

  1. Clone the repo:
git clone --recurse-submodules https://github.com/ziangcao0312/PhysX-3D.git
cd PhysX-3D
  2. Create a new conda environment named physxgen and install the dependencies:
. ./setup.sh --new-env --basic --xformers --flash-attn --diffoctreerast --spconv --mipgaussian --kaolin --nvdiffrast

Note: Detailed usage of setup.sh can be found in the TRELLIS repository.

Training

  1. Download and preprocess the PhysXNet Dataset
huggingface-cli download Caoza/PhysX-3D PhysXNet.zip --repo-type dataset --local-dir ./dataset_toolkits/
cd dataset_toolkits
unzip PhysXNet.zip -d physxnet

Note: Since PartNet provides no texture information, download the ShapeNet dataset and save it to ./dataset_toolkits/shapenet to obtain textures. The validation (first 1k samples) and test (last 1k samples) splits are stored in val_test_list.npy.

bash precess.sh
  2. VAE training:
python train.py \
     --config configs/vae/slat_vae_enc_dec_mesh_phy.json \
     --output_dir outputs/vae_phy \
     --data_dir ./datasets/PhysXNet
  3. Diffusion training:
python train.py \
     --config configs/generation/slat_flow_img_dit_L_phy.json \
     --output_dir outputs/diffusion_phy \
     --data_dir ./datasets/PhysXNet
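The val/test split noted above ("first 1k / last 1k") can be consumed as a simple slice of the shipped id list. The helper below is a sketch under that assumption; the exact contents of val_test_list.npy may differ.

```python
import numpy as np

def split_val_test(sample_ids: np.ndarray, k: int = 1000):
    """Split an ordered id list: first k ids -> validation, last k ids -> test."""
    return sample_ids[:k], sample_ids[-k:]

# In practice the ids would come from the shipped file, e.g.:
#   sample_ids = np.load("dataset_toolkits/val_test_list.npy", allow_pickle=True)
# Here we use a synthetic stand-in to show the slicing.
ids = np.arange(2500)
val_ids, test_ids = split_val_test(ids)
```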

Inference

  1. Download the pre-trained model from Hugging Face:
bash download_pretrain.sh
  2. Run the inference code:
python example.py

Evaluation

  1. Run the example script:
python example_render_gt_foreval.py
  2. Calculate the metrics:

  • Euclidean distance for absolute scale
  • PSNR for density, affordance, and description maps
  • Instantiation distance for kinematics

Note: PhysX-Anything introduces a VLM-based evaluation for kinematics to achieve more flexible and fair evaluation.
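The first two metrics are standard and can be sketched as below. This is an illustrative implementation, not the repository's evaluation code; the `max_val` range of the rendered property maps is an assumption.

```python
import numpy as np

def scale_error(pred_scale, gt_scale) -> float:
    """Euclidean distance between predicted and ground-truth absolute scale."""
    return float(np.linalg.norm(np.asarray(pred_scale) - np.asarray(gt_scale)))

def psnr(pred, gt, max_val: float = 1.0) -> float:
    """PSNR between two rendered property maps (density, affordance, description)."""
    mse = np.mean((np.asarray(pred, dtype=float) - np.asarray(gt, dtype=float)) ** 2)
    if mse == 0:
        return float("inf")  # identical maps
    return float(10.0 * np.log10(max_val ** 2 / mse))
```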

References

If you find PhysX useful for your work, please cite:

@article{cao2025physx,
  title={PhysX-3D: Physical-Grounded 3D Asset Generation},
  author={Cao, Ziang and Chen, Zhaoxi and Pan, Liang and Liu, Ziwei},
  journal={arXiv preprint arXiv:2507.12465},
  year={2025}
}

@article{physxanything,
  title={PhysX-Anything: Simulation-Ready Physical 3D Assets from Single Image},
  author={Cao, Ziang and Hong, Fangzhou and Chen, Zhaoxi and Pan, Liang and Liu, Ziwei},
  journal={arXiv preprint arXiv:2511.13648},
  year={2025}
}

Acknowledgement

The data and code are based on PartNet and TRELLIS. We sincerely thank the contributors.

🗞️ License

Distributed under the S-Lab License. See LICENSE for more information.

