
Neural Jacobian Fields

Sizhe Lester Li, Annan Zhang, Boyuan Chen, Hanna Matusik, Chao Liu, Daniela Rus, Vincent Sitzmann
📄 Paper (Nature, 2025) | 🌐 Project Website | 📖 Tutorial | 🎥 Explainer | 📦 Dataset

[TL;DR] Neural Jacobian Fields are a general-purpose representation of robotic systems that can be learned from perception.


📢 Announcements

  • [2025-06-25] Our paper is now published in Nature.
  • [2025-04-20] Dataset now live on HuggingFace: Link.
  • [2025-03-23] Major tutorial updates for training in 2D simulations.

🚀 Quickstart

We provide software implementations of:

  • 🧠 3D Jacobian Field: project/neural_jacobian_field
  • ✋ 2D Jacobian Field: project/jacobian
  • 🧪 Custom simulator: mujoco-phys-sim

📦 Installation

1. Create Conda Environment

conda create --name neural-jacobian-field python=3.10.8
conda activate neural-jacobian-field

2. Install Dependencies (CUDA 11.8)

pip install torch==2.1.2+cu118 torchvision==0.16.2+cu118 --extra-index-url https://download.pytorch.org/whl/cu118
conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit
pip install ninja git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
pip install git+https://github.com/sizhe-li/nerfstudio.git

3. Install Simulator

git submodule update --init --recursive
cd mujoco-phys-sim/phys_sim
pip install -r requirements.txt
pip install -e .

4. Install Jacobian Fields Codebase

cd project
pip install -r requirements_new.txt
pip install -e .

▶️ Running the Code

📥 Download Pretrained Checkpoints

Download the pretrained checkpoints from Google Drive and place them under the following directories (a scripted download sketch follows):

notebooks/inference_demo_data/real_world_pretrained_ckpts
notebooks/tutorial/tutorial_pretrained_ckpts
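
If you prefer a scripted download, here is a minimal sketch using the gdown package (an assumption; any Google Drive client works). The folder ID is a placeholder that must be replaced with the ID from the Google Drive link above.

import gdown
from pathlib import Path

# Placeholder: replace with the folder ID from the Google Drive link above.
DRIVE_FOLDER_ID = "YOUR_GOOGLE_DRIVE_FOLDER_ID"

# Create the target directories expected by the demo and tutorial notebooks.
for target in [
    "notebooks/inference_demo_data/real_world_pretrained_ckpts",
    "notebooks/tutorial/tutorial_pretrained_ckpts",
]:
    Path(target).mkdir(parents=True, exist_ok=True)

# Download the shared folder, then move the checkpoints into the directories above.
gdown.download_folder(id=DRIVE_FOLDER_ID, output="pretrained_ckpts", quiet=False)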

🧪 Simulated Experiments


Tutorial Notebooks (2D, ~30 mins each)

🦾 Real-World Experiments

✔️ Ready-to-Run Demos

📦 Dataset (HuggingFace)

A multiview video-action dataset with camera poses, covering the following robot platforms (a download sketch follows the list):

  • Pneumatic robot hand (on robot arm)
  • Allegro robot hand
  • Handed Shearing Auxetics platform
  • Poppy robot arm
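
As a rough sketch (not an official loader), the dataset snapshot can be fetched with huggingface_hub; the repository ID below is a placeholder, so substitute the dataset ID from the HuggingFace link above.

from huggingface_hub import snapshot_download

# Placeholder repo ID: replace with the dataset ID from the HuggingFace link above.
local_path = snapshot_download(
    repo_id="<org>/<neural-jacobian-field-dataset>",
    repo_type="dataset",
    local_dir="data/neural_jacobian_field",
)
print("Dataset downloaded to:", local_path)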

🏋️‍♀️ Training

A. Train Perception Module (PixelNeRF)

python3 -m neural_jacobian_field.train dataset=dataset_allegro model=model_allegro dataset.mode=perception

B. Train Jacobian Fields

Visual motion extraction

  • Install TAPIR here.
  • Use scripts/dataset/extract_tapir_motion_tracks.py to extract the motion tracks of the Allegro hand.
  • We are working on a simplified implementation that uses CoTracker for motion-track extraction; it will be released before the end of June. A generic point-tracking sketch follows this list.
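
For illustration only (this is not the repository's extraction script), the sketch below runs a generic point tracker over a video tensor via torch.hub. The cotracker2 entry point, grid size, and tensor shapes are assumptions; check the facebookresearch/co-tracker repository for the current interface.

import torch

# Assumption: the "cotracker2" torch.hub entry point in facebookresearch/co-tracker;
# newer releases may expose differently named entry points.
tracker = torch.hub.load("facebookresearch/co-tracker", "cotracker2")

# Dummy video tensor with shape (batch, time, channels, height, width);
# replace with frames loaded from the multiview video-action dataset.
video = torch.zeros(1, 48, 3, 256, 256)

# pred_tracks: (batch, time, num_points, 2) pixel coordinates of tracked points.
# pred_visibility: (batch, time, num_points) per-point visibility flags.
pred_tracks, pred_visibility = tracker(video, grid_size=16)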

Replace the checkpoint flag with the artifact reference from your own wandb run, then start training:

python3 -m neural_jacobian_field.train dataset=dataset_allegro model=model_allegro dataset.mode=action checkpoint.load=wandb://entity/project/usoftylr:v5

🎥 Camera Conventions

  • Extrinsics: OpenCV-style camera-to-world matrices (+Z = look vector, +X = right, -Y = up)
  • Intrinsics: normalized by image size (first row divided by image width, second row divided by image height); see the sketch below
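
As a minimal sketch of these conventions (the matrix values and image size are made-up examples, not taken from the repository), the snippet below normalizes a pinhole intrinsics matrix and assembles an OpenCV-style camera-to-world extrinsic.

import numpy as np

# Hypothetical pinhole intrinsics for a 640x480 image (focal lengths and principal point are made up).
width, height = 640, 480
K = np.array([
    [500.0,   0.0, 320.0],
    [  0.0, 500.0, 240.0],
    [  0.0,   0.0,   1.0],
])

# Normalize: divide the first row by the image width and the second row by the image height.
K_normalized = K.copy()
K_normalized[0] /= width
K_normalized[1] /= height

# OpenCV-style camera-to-world matrix: the rotation columns are the camera's
# +X (right), +Y (down, so -Y is up), and +Z (look) axes expressed in world
# coordinates, and t is the camera position in world coordinates.
R = np.eye(3)
t = np.zeros(3)
cam_to_world = np.eye(4)
cam_to_world[:3, :3] = R
cam_to_world[:3, 3] = t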

📚 Citation

If you find our work useful, please consider citing us:

@Article{Li2025,
  author={Li, Sizhe Lester
  and Zhang, Annan
  and Chen, Boyuan
  and Matusik, Hanna
  and Liu, Chao
  and Rus, Daniela
  and Sitzmann, Vincent},
  title={Controlling diverse robots by inferring Jacobian fields with deep networks},
  journal={Nature},
  year={2025},
  month={Jun},
  day={25},
  issn={1476-4687},
  doi={10.1038/s41586-025-09170-0},
  url={https://doi.org/10.1038/s41586-025-09170-0}
}

🙏 Acknowledgements

The authors thank Hyung Ju Terry Suh for his writing suggestions (system dynamics) and Tao Chen and Pulkit Agrawal for their hardware support on the Allegro hand. V.S. acknowledges support from the Solomon Buchsbaum Research Fund through MIT’s Research Support Committee. S.L.L. was supported through an MIT Presidential Fellowship. A.Z., H.M., C.L., and D.R. acknowledge support from the National Science Foundation EFRI grant 1830901 and the Gwangju Institute of Science and Technology.
