This repository contains a prototype VR surgery training system developed as a Bachelor's thesis project. The core idea is to let instructors define arbitrary “cutting planes” on a virtual bone model, then allow trainees to practice sawing along those planes in VR while collecting accuracy metrics. Two input methods are supported:
- Controller Version: Uses standard VR controllers (e.g., HTC Vive Pro) for basic vibration feedback.
- SenseGlove Version: Uses SenseGlove DK1 haptic gloves for richer force and vibrotactile feedback.
Both versions of the prototype use the CuttingPlaneCreation.unity scene for managing cutting planes: a separate non-VR scene in which cutting planes are created and stored for later use in the VR simulator.
To choose which version you want to use, edit the version.txt file in this repository. This file is not transferred when you make a build of the application; in a build it is created on first start and can then be found in the folder /VR Surgery Training System_Data.
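As an illustration of this switch mechanism, here is a minimal sketch of how such a version file could be read; the exact token format inside version.txt is an assumption for the example, not taken from the repository:

```python
# Hypothetical sketch of the version.txt switch described above.
# The token format ("Controller" vs. "SenseGlove") is an assumption.
from pathlib import Path

def read_version(data_dir: str) -> str:
    """Return the configured input method, defaulting to 'Controller'."""
    path = Path(data_dir) / "version.txt"
    if not path.exists():
        # mirrors the behavior described above: the file is created on first start
        path.write_text("Controller")
    text = path.read_text().strip()
    return "SenseGlove" if text.lower().startswith("sense") else "Controller"

import tempfile
print(read_version(tempfile.mkdtemp()))   # prints: Controller
```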
## Cutting Plane Creation (Non-VR Scene)
- Lets instructors define arbitrary cutting planes on the virtual bone model and store them for later use in the VR simulator.
## VR Training Simulation
- Loads a selected cutting plane into the VR environment (either controller or SenseGlove version).
- Trainees use a virtual saw to cut along the predefined plane.
- Real-time metrics are computed on each slice, including:
  - Number of times the trainee cut too deep
  - Maximum depth overshoot
  - Percentage of cut segments within the target plane
- Metrics are displayed on a virtual whiteboard during the session and saved to disk for post-session analysis.
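As a rough illustration of how the per-slice metrics listed above could be derived, here is a small sketch; the function name, the tolerance value, and the depth-sampling scheme are assumptions for the example, not the thesis code:

```python
# Illustrative only: computing the listed metrics from sampled cut depths.
# 'depths' is a list of sampled cut depths (mm) along the planned plane.

def cut_metrics(depths, target_depth, tolerance=1.0):
    """Return (times cut too deep, max depth overshoot, % of segments within tolerance)."""
    overshoots = [d - target_depth for d in depths if d > target_depth]
    too_deep_count = len(overshoots)
    max_overshoot = max(overshoots, default=0.0)
    within = sum(1 for d in depths if abs(d - target_depth) <= tolerance)
    pct_within = 100.0 * within / len(depths) if depths else 0.0
    return too_deep_count, max_overshoot, pct_within
```

For example, `cut_metrics([9.5, 10.0, 12.0], 10.0)` reports one cut that went too deep, with a 2.0 mm maximum overshoot.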
## Input & Haptic Feedback
- Controller Version:
  - Scene: Controller_SurgeryRoom.unity (in /VR Training System/Assets/Scenes)
  - Uses SteamVR controllers (specifically the HTC Vive) for positional tracking and basic vibration feedback.
  - Ensure only the active controllers/trackers are enabled before entering the VR scene.
- SenseGlove Version:
  - Scene: Gloves_SurgeryRoom.unity (in /VR Training System/Assets/Scenes)
  - Uses the SenseGlove DK1 for force feedback and vibrotactile cues.
  - Provides a higher level of immersion, but requires the SenseGlove SDK and hardware to be connected and configured.
## Metrics & Data Logging
- During each trial, the system tracks:
  - Total cut length vs. the planned plane
  - Depth deviations (too shallow or too deep)
  - Time to complete the cut
- Results are rendered on a virtual whiteboard for immediate feedback and stored as log files (CSV or JSON) in the application’s data folder.
- Researchers can use these logs to compare user performance across different haptic feedback conditions.
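The CSV side of the logging described above could be sketched as follows; the column names and file layout are assumptions for illustration, not the layout the system actually writes:

```python
# Illustrative sketch of appending one trial to a CSV log file.
# Column names are assumptions, not the thesis implementation.
import csv

def log_trial(path, trial):
    """Append one trial record to the CSV log, writing the header for a new file."""
    fields = ["participant", "condition", "cut_length_mm",
              "max_overshoot_mm", "time_s"]
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fields)
        if f.tell() == 0:          # empty file: write the header first
            writer.writeheader()
        writer.writerow(trial)
```

Appending (rather than overwriting) keeps all trials of a session in one file for later analysis.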
Google Drive Link
One Drive Link
The project uses Unity version 2020.3.3f1; lower versions are not recommended.
The repository uses Git LFS, which must be installed and activated before cloning.
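A typical Git LFS setup looks like this (the repository URL is left as a generic placeholder):

```shell
# Without Git LFS active, the large binary assets are checked out
# as small pointer files instead of the real content.
git lfs install                  # one-time activation per machine
git clone <repository-url>       # clone with the LFS hooks in place
cd <repository-folder>
git lfs pull                     # fetch any LFS objects that are still missing
```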
Used packages:
- SteamVR Unity Plugin (Version 2.7.3)
- SenseGlove Unity Plugin (Version 2.2)
- VIVE Input Utility (Version 1.15.0)
- XR Plugin Management (Version 4.0.7)
- ProBuilder (Version 4.5.2)
- Post Processing (Version 3.1.1)
Packages that are not linked can be found directly in the Unity Package Manager or the Asset Store.
In a user study comparing the two input methods, haptic gloves significantly enhanced immersion, but controller‐based feedback scored higher on usability and imposed a significantly lower mental and physical workload. A medical expert validated the system’s feasibility for orthopedic training, confirming it can support practical surgical skill development.
For more details, have a look at the complete thesis publication.