Science Robotics'25: Surgical embodied intelligence for generalized task autonomy in laparoscopic robot-assisted surgery [Paper] [Code]
arXiv'24: Efficient Physically-based Simulation of Soft Bodies in Embodied Environment for Surgical Robot [Paper] [Code]
ICRA'24: Multi-objective Cross-task Learning via Goal-conditioned GPT-based Decision Transformers for Surgical Robot Task Automation [Paper] [Code]
IROS'23: Value-Informed Skill Chaining for Policy Learning of Long-Horizon Tasks with Surgical Robot [Paper] [Code]
RA-L'23: Human-in-the-loop Embodied Intelligence with Interactive Simulation Environment for Surgical Robot Learning [Paper] [Code]
ICRA'23: Demonstration-Guided Reinforcement Learning with Efficient Exploration for Task Automation of Surgical Robot [Paper] [Code]
ISMR'22: Integrating Artificial Intelligence and Augmented Reality in Robotic Surgery: An Initial dVRK Study Using a Surgical Education Scenario [Paper]
IROS'21: SurRoL: An Open-source RL Centered and dVRK Compatible Platform for Surgical Robot Learning [Paper] [Code]
- dVRK-compatible robots.
- OpenAI Gym-style API for reinforcement learning.
- Rich collection of assets and task environments.
- Based on PyBullet and Taichi for physics simulation.
- Allows human interaction via the Touch Haptic Device and real-world dVRK robots.
- Zero-shot sim-to-real transfer capabilities.
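As a quick illustration of the Gym-style API, the sketch below runs a random policy in one task environment. It is a minimal sketch: the `surrol.gym` registration import and the `NeedleReach-v0` task id follow the naming used in the SurRoL codebase, but verify both against the repository's task list.

```python
# Minimal Gym-style rollout in a SurRoL task environment.
# Assumptions: importing surrol.gym registers the tasks with Gym,
# and "NeedleReach-v0" is a valid task id -- check the repo's task list.
import gym
import surrol.gym  # noqa: F401  (import side effect: registers SurRoL envs)

env = gym.make("NeedleReach-v0")
obs = env.reset()
for _ in range(100):
    action = env.action_space.sample()          # random placeholder policy
    obs, reward, done, info = env.step(action)
    if done:
        obs = env.reset()
env.close()
```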
System Requirements: Ubuntu 20.04 with Python 3.7
|- VPPV                             # surgical task automation framework
    |- Training                     # simulator training
        |- data_generation          # environment for generating the data to train the perceptual regressor
        |- state_regress            # code for training the perceptual regressor
        |- policy_learning          # environment for training the RL policy
    |- Deployment                   # VPPV deployment in the real world
        |- dVRK                     # code for VPPV deployment on the game-based training tasks
        |- Sentire                  # code for VPPV deployment in the ex vivo and in vivo experiments
|- Benchmark                        # benchmark for policy learning
    |- state_based                  # environment and implementation for state-based methods
    |- vision_based                 # environment and implementation for vision-based methods
|- Haptic_guidance                  # environment and implementation for intelligent haptic guidance
|- Data_driven_scene_simulation     # environment and implementation for data-driven surgical scene simulation
The VPPV framework consists of two main components, Training and Deployment.

Training:

- **Data Generation** (`VPPV/Training/data_generation/`): environments for collecting training data. Use `python data_generation.py --env ${task_name}` to create datasets for perceptual regressor training.
- **State Regression** (`VPPV/Training/state_regress/`): training scripts for the perceptual regressor models. Run `python train.py` to train the network.
- **Policy Learning** (`VPPV/Training/policy_learning/`): RL training environments that use the pretrained perceptual regressor. Execute `python3 rl/train.py task=${task_name} agent=ddpg use_wb=True` to train control policies. The three stages can be chained as shown in the sketch after this list.
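Below is a minimal driver sketch that chains the three training stages by shelling out to the commands documented above; the concrete task name is a placeholder and should be replaced with a task supported by the environments.

```python
# Minimal driver chaining the three documented training stages.
# TASK is a placeholder (assumption); substitute a real task name.
import subprocess

TASK = "your_task_name"

# 1. Collect data for the perceptual regressor.
subprocess.run(["python", "data_generation.py", "--env", TASK],
               cwd="VPPV/Training/data_generation", check=True)

# 2. Train the perceptual regressor on the generated data.
subprocess.run(["python", "train.py"],
               cwd="VPPV/Training/state_regress", check=True)

# 3. Train the RL control policy with the pretrained regressor.
subprocess.run(["python3", "rl/train.py", f"task={TASK}",
                "agent=ddpg", "use_wb=True"],
               cwd="VPPV/Training/policy_learning", check=True)
```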
Deployment:

- **dVRK Integration** (`VPPV/Deployment/dVRK/`): scripts for deploying the game-based training tasks on a real-world dVRK, plus configuration files for robot setup. Run `python super_player.py --task ${task_name}` to execute VPPV.
- **Sentire System** (`VPPV/Deployment/Sentire/`): code for the ex vivo and in vivo experiments. Run `python super_player.py --task ${task_name}` to execute VPPV.
Implementation of the intelligent haptic guidance system:
- Located in `Haptic_guidance/`
- Includes the haptic feedback algorithms
- Run `python tests/main.py` to start the demo
Tools for realistic surgical scene simulation:
- Found in `Data_driven_scene_simulation/`
- Data-driven scene reconstruction and simulation
- Run `python gs_interaction.py` for simulation
This project was developed on ROS Noetic with dVRK 2.1.
Follow this guide to build dVRK and check all the prerequisites listed here.
More information about dVRK can be found in the dVRK documentation and on the wiki page.
Follow this guide to calibrate the stereo endoscopic camera.
Follow this guide for hand-eye calibration of the dVRK.
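Before deploying, it can be useful to check that ROS and the robot are reachable from Python. The sketch below uses the dVRK Python client; the arm name `PSM1` and the `enable`/`home`/`measured_jp` calls follow the crtk-based API of dVRK 2.x, but treat the exact signatures as assumptions and confirm them against your dVRK installation.

```python
# Quick dVRK connectivity check (assumes a running dVRK console on ROS Noetic).
# API names follow the dVRK 2.x crtk-based Python client; verify locally.
import rospy
import dvrk

rospy.init_node("dvrk_sanity_check")
psm = dvrk.psm("PSM1")  # arm name is an assumption; match your console config
if psm.enable(10) and psm.home(10):  # 10-second timeouts
    print("PSM1 joint positions:", psm.measured_jp())
else:
    print("Failed to enable/home PSM1 -- check the dVRK console.")
```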
If you find the paper or the code helpful to your research, please cite the corresponding papers:
@inproceedings{xu2021surrol,
  title={SurRoL: An Open-source Reinforcement Learning Centered and dVRK Compatible Platform for Surgical Robot Learning},
  author={Xu, Jiaqi and Li, Bin and Lu, Bo and Liu, Yun-Hui and Dou, Qi and Heng, Pheng-Ann},
  booktitle={2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  year={2021},
  organization={IEEE}
}

@article{long2025surgical,
  title={Surgical embodied intelligence for generalized task autonomy in laparoscopic robot-assisted surgery},
  author={Long, Yonghao and Lin, Anran and Kwok, Derek Hang Chun and Zhang, Lin and Yang, Zhenya and Shi, Kejian and Song, Lei and Fu, Jiawei and Lin, Hongbin and Wei, Wang and others},
  journal={Science Robotics},
  volume={10},
  number={104},
  pages={eadt3093},
  year={2025},
  publisher={American Association for the Advancement of Science}
}
The code is released under the MIT license.
The code is built with reference to dVRK, AMBF, dVRL, RLBench, Decentralized-MultiArm, Ravens, etc.