DexGrasp Anything: Towards Universal Robotic Dexterous Grasping with Physics Awareness

CVPR 2025 (Highlight)
Yiming Zhong*, Qi Jiang*, Jingyi Yu, Yuexin Ma
ShanghaiTech University
*Indicates Equal Contribution

📖 Project Page | 📄 Paper Link

We present DexGrasp Anything, which consistently surpasses previous dexterous grasping generation methods across five benchmarks. Visualizations of our method's results are shown on the left.


📣 News

  • [2/27/2025] 🎉🎉🎉 DexGrasp Anything has been accepted by CVPR 2025!!! 🎉🎉🎉

😲 Results

Please refer to our homepage for more thrilling results!

📚 Datasets

In our data processing, the rotation and translation follow $Y = a(x + b)$, where $Y$ is the Shadow Hand after rotation and translation, $x$ is the original hand, $a$ is the rotation, and $b$ is the translation. It is important to emphasize that we use $Y = a(x + b)$ rather than $Y = ax + b$. This formulation allows us to conveniently transfer the rotation to the object instead, i.e., $O' = a^{\top} O$, enabling more flexible manipulation.
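As a quick sanity check of this convention, the following numpy sketch (illustrative, not part of the repository) verifies that rotating the hand by $a$ is equivalent to leaving the hand at $x + b$ and counter-rotating the object by $a^{\top}$: the hand-object relative geometry is identical in both views.

    # Illustrative only: check that Y = a(x + b) lets the rotation be moved
    # onto the object as O' = a^T O without changing hand-object geometry.
    import numpy as np

    rng = np.random.default_rng(0)
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    a = q * np.sign(np.linalg.det(q))       # proper rotation matrix (det = +1)
    b = rng.normal(size=3)                  # translation
    x = rng.normal(size=(21, 3))            # hand keypoints (illustrative)
    O = rng.normal(size=(1024, 3))          # object point cloud

    Y = (x + b) @ a.T                       # hand after Y = a(x + b) (row vectors)
    O_rot = O @ a                           # object after O' = a^T O (row vectors)

    # Pairwise hand-object distances agree whether we rotate the hand or the object.
    d_world = np.linalg.norm(Y[:, None, :] - O[None, :, :], axis=-1)
    d_obj = np.linalg.norm((x + b)[:, None, :] - O_rot[None, :, :], axis=-1)
    assert np.allclose(d_world, d_obj)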

| Datasets | Huggingface Link | Google Drive Link (Format Compatible with Our Dataloader) | Paper |
| --- | --- | --- | --- |
| DGA | Huggingface | Datasets | DexGrasp Anything: Towards Universal Robotic Dexterous Grasping with Physics Awareness |
| RealDex | Huggingface | Datasets | RealDex: Towards Human-like Grasping for Robotic Dexterous Hand |
| DexGraspNet | Huggingface | Datasets | DexGraspNet: A Large-Scale Robotic Dexterous Grasp Dataset for General Objects Based on Simulation |
| UniDexGrasp | Huggingface | Datasets | UniDexGrasp: Universal Robotic Dexterous Grasping via Learning Diverse Proposal Generation and Goal-Conditioned Policy |
| MultiDex | Huggingface | Datasets | GenDexGrasp: Generalizable Dexterous Grasping |
| DexGRAB (retargeted from GRAB) | Huggingface | Datasets | GRAB: A Dataset of Whole-Body Human Grasping of Objects |

🦾 Make your own dataset

  • First, your dataset should contain mesh files of the objects. You can create an object_pcds_nors.pkl file by sampling point clouds from these meshes. Modify the paths in Process_your_dataset/make_obj_pcds.py to build your dataset (a hedged sketch of this step is shown after this list).

    python Process_your_dataset/make_obj_pcds.py
  • Based on your dataset's pose format (e.g., translation, rotation, qpos, scale), create a corresponding .pt file; you can refer to the input examples in the datasets folder (and the second sketch after this list) to build it. Make sure you understand your dataset's pose convention: pay special attention to whether the translation, rotation, and scale are applied to the object point cloud or to the robotic hand. You can also modify your dataloader to verify visually that the input poses and object point clouds align correctly.

  • During the testing phase, you need to generate a URDF file for every mesh. You can use Process_your_dataset/make_obj_urdf.py to generate them (recommended to place them in the same directory as the mesh files); a hedged sketch is shown after this list.

    python Process_your_dataset/make_obj_urdf.py
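The sketch below (not the repository's make_obj_pcds.py) shows one way an object_pcds_nors.pkl file could be assembled. It assumes trimesh is installed; the dictionary layout, key names, and point count are assumptions that should be matched against what the dataloader actually expects.

    # Hypothetical sketch: sample surface points and normals from each mesh and
    # pickle them as {object_name: (N, 6) float32 array of [x, y, z, nx, ny, nz]}.
    import glob
    import os
    import pickle

    import numpy as np
    import trimesh

    MESH_DIR = "path/to/your/meshes"    # placeholder path
    OUT_PKL = "object_pcds_nors.pkl"
    N_POINTS = 4096                     # per-object sample count (assumption)

    pcds = {}
    for mesh_path in sorted(glob.glob(os.path.join(MESH_DIR, "*.obj"))):
        mesh = trimesh.load(mesh_path, force="mesh")
        points, face_idx = trimesh.sample.sample_surface(mesh, N_POINTS)
        normals = mesh.face_normals[face_idx]
        name = os.path.splitext(os.path.basename(mesh_path))[0]
        pcds[name] = np.concatenate([points, normals], axis=1).astype(np.float32)

    with open(OUT_PKL, "wb") as f:
        pickle.dump(pcds, f)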
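For the pose file, here is a purely illustrative example of packaging grasps into a .pt file with torch.save. The key names and tensor shapes are assumptions; mirror the example files under the datasets folder, and keep the translation and rotation in the same $Y = a(x + b)$ convention described above.

    # Hypothetical pose file: one entry per grasp. Field names and shapes are
    # guesses -- align them with the example .pt files in the datasets folder.
    import torch

    num_grasps = 2                                           # toy size
    data = {
        "object_name": ["mug"] * num_grasps,                 # target object per grasp
        "translation": torch.zeros(num_grasps, 3),           # b in Y = a(x + b)
        "rotation": torch.eye(3).repeat(num_grasps, 1, 1),   # a in Y = a(x + b)
        "qpos": torch.zeros(num_grasps, 22),                 # joint angles (width depends on the hand model)
        "scale": torch.ones(num_grasps),                     # object scale
    }
    torch.save(data, "my_dataset_poses.pt")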
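Finally, a hedged sketch of what a per-object URDF generator could look like (the template and inertial values are illustrative guesses, not what Process_your_dataset/make_obj_urdf.py actually emits):

    # Hypothetical sketch: wrap each mesh in a minimal single-link URDF so the
    # simulator can load it at test time.
    import glob
    import os

    URDF_TEMPLATE = """<?xml version="1.0"?>
    <robot name="{name}">
      <link name="object">
        <visual>
          <geometry><mesh filename="{mesh}" scale="1 1 1"/></geometry>
        </visual>
        <collision>
          <geometry><mesh filename="{mesh}" scale="1 1 1"/></geometry>
        </collision>
        <inertial>
          <mass value="0.1"/>
          <inertia ixx="1e-4" ixy="0" ixz="0" iyy="1e-4" iyz="0" izz="1e-4"/>
        </inertial>
      </link>
    </robot>
    """

    for mesh_path in sorted(glob.glob("path/to/your/meshes/*.obj")):  # placeholder path
        name = os.path.splitext(os.path.basename(mesh_path))[0]
        urdf_path = os.path.join(os.path.dirname(mesh_path), name + ".urdf")
        with open(urdf_path, "w") as f:
            f.write(URDF_TEMPLATE.format(name=name, mesh=os.path.basename(mesh_path)))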

🛠️ Setup

    1. Create a new conda environment and activate it. (My CUDA version, per nvcc --version, is 11.7.)
    conda create -n DGA python=3.8
    conda activate DGA
    pip install torch==1.11.0+cu113 torchvision==0.12.0+cu113 --extra-index-url https://download.pytorch.org/whl/cu113
    2. Install the required packages. You can change TORCH_CUDA_ARCH_LIST according to your GPU architecture.
    TORCH_CUDA_ARCH_LIST="7.0;7.5;8.0;8.6" pip install -r requirements.txt

    Please install in an environment with a GPU; otherwise the build will fail.

    cd src
    git clone https://github.com/wrc042/CSDF.git
    cd CSDF
    pip install -e .
    cd ..
    git clone https://github.com/facebookresearch/pytorch3d.git
    cd pytorch3d
    git checkout tags/v0.7.2  
    FORCE_CUDA=1  TORCH_CUDA_ARCH_LIST="7.5;8.0;8.6"  python setup.py install
    cd ..
    3. Install Isaac Gym. Follow the official installation guide to install Isaac Gym and its dependencies. You will obtain an archive named IsaacGym_Preview_4_Package.tar.gz; place it at ./src/IsaacGym_Preview_4_Package.tar.gz.
    tar -xzvf IsaacGym_Preview_4_Package.tar.gz
    cd isaacgym/python
    pip install -e .

Before training and testing, please make sure to set the dataset path, model size, whether to use the LLM, the sampling method, and other parameters in the config files.

Train

  • Train with a single GPU

    bash scripts/grasp_gen_ur/train.sh ${EXP_NAME}
  • Train with multiple GPUs

    bash scripts/grasp_gen_ur/train_ddm.sh ${EXP_NAME}

Sample

bash scripts/grasp_gen_ur/sample.sh ${exp_dir} [OPT]
# e.g., without Physics-Guided Sampling:  bash scripts/grasp_gen_ur/sample.sh /outputs/exp_dir
# e.g., with Physics-Guided Sampling:     bash scripts/grasp_gen_ur/sample.sh /outputs/exp_dir OPT
  • [OPT] is an optional argument: pass the literal OPT to enable Physics-Guided Sampling, or omit it to sample without it.

Test

First, you need to run scripts/grasp_gen_ur/sample.sh to sample some results. You also need to set the dataset file paths in /envs/tasks/grasp_test_force_shadowhand.py and /scripts/grasp_gen_ur/test.py. Then, quantitative metrics will be computed from these sampled results.

bash scripts/grasp_gen_ur/test.sh ${EVAL_DIR} 
# e.g., bash scripts/grasp_gen_ur/test.sh  /outputs/exp_dir/eval/final/2025-03-16_19-15-31

Checkpoints

| DexGrasp Anything (w/o LLM) | Huggingface Link | Google Drive Link |
| --- | --- | --- |
| RealDex | Huggingface | CKPT |
| DexGraspNet | Huggingface | CKPT |
| UniDexGrasp | Huggingface | CKPT |
| MultiDex | Huggingface | CKPT |
| DexGRAB | Huggingface | CKPT |

🚩 Plan

  • Paper Released.
  • Source Code.
  • Dataset.
  • Make your own dataset.
  • Checkpoints.

🎫 License

For academic use, this project is licensed under the 2-clause BSD License.

💓 Acknowledgement

We would like to acknowledge that some code and datasets are borrowed from Scene-Diffuser, RealDex, DexGraspNet, UniDexGrasp, GRAB, and the MultiDex dataset. We appreciate the authors for their great contributions to the community and for open-sourcing their code and datasets.

🖊️ Citation

@article{zhong2025dexgrasp,
  title={DexGrasp Anything: Towards Universal Robotic Dexterous Grasping with Physics Awareness},
  author={Zhong, Yiming and Jiang, Qi and Yu, Jingyi and Ma, Yuexin},
  journal={arXiv preprint arXiv:2503.08257},
  year={2025}
}
