Sizhe Lester Li, Annan Zhang, Boyuan Chen, Hanna Matusik, Chao Liu, Daniela Rus, Vincent Sitzmann
📄 Paper (Nature, 2025) | 🌐 Project Website | 📖 Tutorial | 🎥 Explainer | 📦Dataset
[TL;DR] Neural Jacobian Fields are a general-purpose representation of robotic systems that can be learned from perception.
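For intuition, here is a minimal numpy sketch (our simplified reading of the paper, not the released API) of how a Jacobian field is used: for each 3D query point, the network predicts a Jacobian that linearly relates the actuation command to that point's motion, so a command can be recovered by least squares from desired point displacements. Shapes and variable names below are illustrative assumptions.

```python
# Minimal sketch (not the released API): per-point Jacobians J_i predicted by a
# Jacobian field relate a command u to point motion via d_i ≈ J_i u. Given desired
# displacements d_i, recover u by linear least squares.
import numpy as np

def solve_command(jacobians: np.ndarray, desired_motion: np.ndarray) -> np.ndarray:
    """jacobians: (N, 3, A) per-point Jacobians; desired_motion: (N, 3) targets.
    Returns the command u of shape (A,) minimizing sum_i ||J_i u - d_i||^2."""
    N, _, A = jacobians.shape
    J = jacobians.reshape(N * 3, A)      # stack all point Jacobians row-wise
    d = desired_motion.reshape(N * 3)    # stack the matching displacements
    u, *_ = np.linalg.lstsq(J, d, rcond=None)
    return u

# Toy usage with random numbers standing in for network predictions.
rng = np.random.default_rng(0)
u = solve_command(rng.normal(size=(100, 3, 8)), rng.normal(size=(100, 3)))
print(u.shape)  # (8,)
```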
- [2025-06-25] Our paper is now published in Nature.
- [2025-04-20] Dataset now live on HuggingFace: Link.
- [2025-03-23] Major tutorial updates for training in 2D simulations.
We provide software implementations of:
- 🧠 3D Jacobian Field: project/neural_jacobian_field
- ✋ 2D Jacobian Field: project/jacobian
- 🧪 Custom simulator: mujoco-phys-sim
conda create --name neural-jacobian-field python=3.10.8
conda activate neural-jacobian-field
pip install torch==2.1.2+cu118 torchvision==0.16.2+cu118 --extra-index-url https://download.pytorch.org/whl/cu118
conda install -c "nvidia/label/cuda-11.8.0" cuda-toolkit
pip install ninja git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
pip install git+https://github.com/sizhe-li/nerfstudio.git
git submodule update --init --recursive
cd mujoco-phys-sim/phys_sim
pip install -r requirements.txt
pip install -e .
cd project
pip install -r requirements_new.txt
pip install -e .
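After the install steps above, a quick sanity check along these lines (our suggestion, not a script shipped with the repo) confirms that the CUDA build of PyTorch and the tiny-cuda-nn bindings are working:

```python
# Quick environment check (our suggestion, not part of the repo).
import torch
import tinycudann as tcnn  # raises ImportError if the CUDA extension failed to build

print("torch:", torch.__version__, "| cuda available:", torch.cuda.is_available())
print("tiny-cuda-nn bindings imported:", tcnn.__name__)
```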
Download the pretrained checkpoints from Google Drive and place them under:
notebooks/inference_demo_data/real_world_pretrained_ckpts
notebooks/tutorial/tutorial_pretrained_ckpts
Tutorial Notebooks (2D, ~30 mins each)
A multiview video-action dataset with camera poses that includes the following platforms (a rough sketch of a sample's contents follows the list):
- Pneumatic robot hand (on robot arm)
- Allegro robot hand
- Handed Shearing Auxetics platform
- Poppy robot arm
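As a rough mental model of what one sample carries, the snippet below is a hypothetical illustration; the field names and shapes are assumptions, not the dataset's actual schema.

```python
# Hypothetical layout of a single multiview video-action sample; names and
# shapes are illustrative assumptions, not the dataset's actual schema.
import numpy as np

num_cameras, num_frames, H, W, num_actuators = 4, 16, 256, 256, 16
sample = {
    "rgb": np.zeros((num_cameras, num_frames, H, W, 3), dtype=np.uint8),      # multiview video
    "actions": np.zeros((num_frames, num_actuators), dtype=np.float32),       # actuation commands
    "extrinsics": np.tile(np.eye(4, dtype=np.float32), (num_cameras, 1, 1)),  # camera-to-world poses
    "intrinsics": np.tile(np.eye(3, dtype=np.float32), (num_cameras, 1, 1)),  # normalized intrinsics
}
print({k: v.shape for k, v in sample.items()})
```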
python3 -m neural_jacobian_field.train dataset=dataset_allegro model=model_allegro dataset.mode=perception
- Install TAPIR here.
- Use scripts/dataset/extract_tapir_motion_tracks.py to extract the motion tracks of the Allegro hand.
- We are working on a simplified implementation that uses CoTracker for motion-track extraction, to be released before the end of June (a rough sketch of that approach appears below).
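Until that release, a sketch of what such a CoTracker-based extraction could look like is below. It follows the torch.hub usage documented by CoTracker and is not the repo's own script; the grid size and video shape are placeholder assumptions.

```python
# Hedged sketch (not the repo's script): dense point tracking with CoTracker via
# torch.hub, following the usage documented in the CoTracker README.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
cotracker = torch.hub.load("facebookresearch/co-tracker", "cotracker2").to(device)

# video: float tensor of shape (B, T, C, H, W) with values in [0, 255];
# replace the zeros below with your own frames.
video = torch.zeros(1, 16, 3, 256, 256, device=device)
pred_tracks, pred_visibility = cotracker(video, grid_size=10)  # (B, T, N, 2), (B, T, N)
print(pred_tracks.shape, pred_visibility.shape)
```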
Replace the checkpoint flag with the checkpoint artifact from your own wandb run and start training:
python3 -m neural_jacobian_field.train dataset=dataset_allegro model=model_allegro dataset.mode=action checkpoint.load=wandb://entity/project/usoftylr:v5
- Extrinsics: OpenCV-style camera-to-world matrices (+Z = look vector, +X = right, -Y = up)
- Intrinsics: Normalized (row 1 ÷ width, row 2 ÷ height)
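For concreteness, here is a small numpy example (ours, for illustration) of converting a pixel-space pinhole intrinsics matrix to the normalized convention above and reading the camera axes out of a camera-to-world extrinsic:

```python
# Illustration (ours) of the camera conventions above.
import numpy as np

W, H = 640, 480
K_pixels = np.array([[500.0,   0.0, 320.0],   # [fx, 0, cx]
                     [  0.0, 500.0, 240.0],   # [0, fy, cy]
                     [  0.0,   0.0,   1.0]])

# Normalized intrinsics: first row divided by image width, second row by image height.
K_norm = K_pixels.copy()
K_norm[0] /= W
K_norm[1] /= H

# OpenCV-style camera-to-world matrix: the rotation columns are the camera axes
# expressed in world coordinates (+X = right, +Z = look vector, so -Y = up).
c2w = np.eye(4)
right, down, look = c2w[:3, 0], c2w[:3, 1], c2w[:3, 2]
up = -down
print(K_norm)
print(right, up, look)
```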
If you find our work useful, please consider citing us:
@Article{Li2025,
author={Li, Sizhe Lester
and Zhang, Annan
and Chen, Boyuan
and Matusik, Hanna
and Liu, Chao
and Rus, Daniela
and Sitzmann, Vincent},
title={Controlling diverse robots by inferring Jacobian fields with deep networks},
journal={Nature},
year={2025},
month={Jun},
day={25},
issn={1476-4687},
doi={10.1038/s41586-025-09170-0},
url={https://doi.org/10.1038/s41586-025-09170-0}
}
The authors thank Hyung Ju Terry Suh for his writing suggestions (system dynamics) and Tao Chen and Pulkit Agrawal for their hardware support on the Allegro hand. V.S. acknowledges support from the Solomon Buchsbaum Research Fund through MIT's Research Support Committee. S.L.L. was supported through an MIT Presidential Fellowship. A.Z., H.M., C.L., and D.R. acknowledge support from the National Science Foundation EFRI grant 1830901 and the Gwangju Institute of Science and Technology.
