Zixuan Chen*,1,2 | Mazeyu Ji*,1 | Xuxin Cheng1 | Xuanbin Peng1 | Xue Bin Peng†2 | Xiaolong Wang†1

1 UC San Diego    2 Simon Fraser University

\* Equal Contribution    † Equal Advising
This codebase supports MuJoCo simulation of motion tracking on the Unitree G1 robot. We provide a pretrained checkpoint and several example motions. The codebase is lightweight and easy to use; we have tested it on both Linux and M1 macOS.
First, clone this repo and install all the dependencies:
```bash
conda create -n gmt python=3.8 && conda activate gmt
pip install torch torchvision torchaudio
pip install "numpy==1.23.0" pydelatin tqdm opencv-python ipdb "imageio[ffmpeg]" mujoco mujoco-python-viewer scipy matplotlib
```
You can then test the pretrained policy on several example motions by running:
```bash
python sim2sim.py --robot g1 --motion walk_stand.pkl
```
To change motions, you can replace walk_stand.pkl with other motions in the motions folder.
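If you want to see what a motion file contains before running it, a minimal inspection sketch is below. It assumes the files are standard Python pickles (which the .pkl extension suggests); the exact layout is not documented here, so the sketch only prints whatever is actually stored:

```python
import pickle
from pathlib import Path

# List the bundled example motions.
for path in sorted(Path("motions").glob("*.pkl")):
    print(path.name)

# Load one motion and print its top-level structure.
with open("motions/walk_stand.pkl", "rb") as f:
    motion = pickle.load(f)

if isinstance(motion, dict):
    for key, value in motion.items():
        # Arrays report their shape; anything else reports its type.
        print(key, getattr(value, "shape", type(value)))
else:
    print(type(motion))
```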
You can also view the kinematic motion (without physics) by running:
```bash
python view_motion.py --motion walk_stand.pkl
```
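For intuition, a kinematic replay loop in MuJoCo generally looks like the sketch below. The MJCF path is a placeholder (point it at the G1 model shipped with this repo), and the trajectory here is a stand-in; view_motion.py remains the supported entry point.

```python
import time
import mujoco
import mujoco_viewer  # from the mujoco-python-viewer package
import numpy as np

# Hypothetical MJCF path: replace with the G1 model in this repo.
model = mujoco.MjModel.from_xml_path("assets/g1.xml")
data = mujoco.MjData(model)
viewer = mujoco_viewer.MujocoViewer(model, data)

# Stand-in trajectory that just holds the default pose. In practice you
# would decode per-frame qpos arrays from the motion .pkl instead.
qpos_frames = np.repeat(data.qpos.copy()[None, :], 100, axis=0)

for qpos in qpos_frames:
    data.qpos[:] = qpos
    mujoco.mj_forward(model, data)  # forward kinematics only, no dynamics
    viewer.render()
    time.sleep(1.0 / 30.0)  # assumes 30 FPS motion data

viewer.close()
```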
Although the pretrained policy has been tested successfully on our machine, its performance may vary across robots, and we cannot guarantee successful deployment on every machine. The model we provide is for research use only, and we disclaim all responsibility for any harm, loss, or malfunction arising from its deployment.
- Data processing and retargeter code will be released soon.
- The MuJoCo simulation script is originally adapted from LCP.
- For the human motion part, we mainly refer to ASE and PHC.
If you find this codebase useful, please consider citing our work:
```bibtex
@article{chen2025gmt,
  title={GMT: General Motion Tracking for Humanoid Whole-Body Control},
  author={Chen, Zixuan and Ji, Mazeyu and Cheng, Xuxin and Peng, Xuanbin and Peng, Xue Bin and Wang, Xiaolong},
  journal={arXiv:2506.14770},
  year={2025}
}
```