This repository contains TerraSLAM, a comprehensive system that transforms SLAM localization results from local, relative coordinates into global, geospatial ones such as GPS. If you find this work helpful, please consider citing:
```bibtex
@inproceedings{xu2025terraslam,
  author = {Xu, Jingao and Bala, Mihir and Eiszler, Thomas and Chen, Xiangliang and Dong, Qifei and Chanana, Aditya and Pillai, Padmanabhan and Satyanarayanan, Mahadev},
  booktitle = {Proceedings of the ACM MobiSys},
  title = {TerraSLAM: Towards GPS-Denied Localization},
  year = {2025},
}
```
To run TerraSLAM and reproduce the demo above, you need to launch three modules:
- TerraSLAM Docker with ORB-SLAM3 ROS 2 Interface: dockerized TerraSLAM with a wrapper for ORB-SLAM3 on ROS 2 Humble (Ubuntu 22.04). The ORB-SLAM3 wrapper is partially based on suchetanrs's work
- TerraSLAM Relay: Calculate the transformation matrix among the SLAM, GIS, and GPS coordinate systems, and use it to convert SLAM localization results into GPS coordinates (latitude, longitude, altitude)
- TerraSLAM Visualization: Display the localization results on Google Maps (right side), and use Blender 2.93 to render the localization results within the GIS model (left side)
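The relay's conversion step can be sketched roughly as follows. This is a minimal illustration, not the actual `relay.py` code: it assumes `transform.json` encodes a 4x4 similarity transform taking SLAM coordinates into a local ENU (east-north-up) frame anchored at a known reference GPS fix, and uses a small-area spherical approximation for the ENU-to-latitude/longitude step.

```python
import numpy as np

def slam_to_gps(p_slam, T, ref_lat, ref_lon, ref_alt):
    """Map a SLAM point to GPS via a 4x4 similarity transform.

    Hypothetical convention: T takes SLAM coordinates into a local
    ENU frame (metres) anchored at the reference GPS fix. The real
    transform.json layout may differ.
    """
    # Apply the homogeneous transform, keep the east/north/up offsets.
    e, n, u = (T @ np.append(p_slam, 1.0))[:3]
    R = 6378137.0  # WGS-84 equatorial radius in metres
    lat = ref_lat + np.degrees(n / R)
    lon = ref_lon + np.degrees(e / (R * np.cos(np.radians(ref_lat))))
    return lat, lon, ref_alt + u

# Identity transform at the SLAM origin returns the reference fix.
T = np.eye(4)
lat, lon, alt = slam_to_gps(np.zeros(3), T, 40.4442, -79.9457, 300.0)
```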
TerraSLAM provides the dockerized wrapper for ORB-SLAM3 on ROS 2 Humble for Ubuntu 22.04. Currently, it supports both Monocular and Stereo setups for ORB-SLAM3.
```bash
git clone https://github.com/cmusatyalab/TerraSLAM.git
cd TerraSLAM
git submodule update --init --recursive --remote
```
Skip this step if you have already installed Docker:

```bash
cd TerraSLAM
sudo chmod +x container_root/shell_scripts/docker_install.sh
./container_root/shell_scripts/docker_install.sh
```
- Build the image:

  ```bash
  sudo docker build -t orb-slam3-humble:22.04 .
  ```
- You can see the built images on your machine by running:

  ```bash
  sudo docker images | grep orb-slam3-humble
  ```
- (Optional) Add `xhost +` to your `.bashrc` to support correct X11 forwarding:

  ```bash
  echo "xhost +" >> ~/.bashrc
  source ~/.bashrc
  ```
- Run the TerraSLAM container:

  ```bash
  cd TerraSLAM
  sudo docker compose run TerraSLAM
  ```
- This should take you inside the container. Once you are inside, run the command `xeyes`, and a pair of eyes should pop up. If they do, X11 forwarding has been set up correctly on your computer.
- Once you have constructed the container, you can attach further shells to it through:

  ```bash
  docker exec -it -e DISPLAY=$DISPLAY TerraSLAM /bin/bash
  ```

  or

  ```bash
  docker exec -it -e DISPLAY=$DISPLAY @container_id /bin/bash
  ```
- In the constructed container, first set up the bash environment:

  ```bash
  cd && mv bashrc_temp .bashrc
  source .bashrc
  ```
Launch the container and then:

```bash
cd /root/colcon_ws/
colcon build --symlink-install
source install/setup.bash
```
Here we use the demo above as an example of how to run TerraSLAM.
- Video Frames: in the TerraSLAM container, download the compressed drone-captured video frame folder, Mill-video, uncompress it, and move it into the `Database` folder. Also, copy `image_publish.py` from the `TerraSLAM_runtime` folder to the `Database` folder:

  ```bash
  cd && mkdir -p Database
  cd Database
  wget https://storage.cmusatyalab.org/terra-slam/mill-video.tgz
  tar xzvf mill-video.tgz
  cd && cp TerraSLAM_runtime/image_publish.py Database/image_publish.py
  ```
- Pre-built SLAM Map: in the TerraSLAM container, download the SLAM map, Mill-19-Map, into the `Map` folder:

  ```bash
  cd && mkdir -p Map
  cd Map
  wget https://storage.cmusatyalab.org/terra-slam/Map-Mill-19-2024.osa
  ```

  Please refer to the `ORB_SLAM3` submodule's README for instructions on how to save, import, and merge maps created by SLAM.
- SLAM-GPS Transformation Matrix: in the TerraSLAM container, download the Mill-19 transformation matrix, transform.json, into the `TerraSLAM_relay` folder:

  ```bash
  cd /root/TerraSLAM_relay
  wget https://storage.cmusatyalab.org/terra-slam/transform.json
  ```

  Similarly, if you want to learn more about how to compute the transformation matrix between SLAM and GPS, please refer to the code and README in the `SLAM-GPS-align` subfolder.
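For intuition, the core of such an alignment is typically a least-squares similarity fit (Umeyama's method) between corresponding SLAM and georeferenced points. The sketch below is a stand-alone illustration under that assumption, not the code from the `SLAM-GPS-align` subfolder:

```python
import numpy as np

def align_umeyama(src, dst):
    """Fit dst ~= s * R @ src + t in the least-squares sense and
    return the 4x4 homogeneous transform (scale, rotation, translation)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    n = len(src)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    src_c, dst_c = src - mu_s, dst - mu_d
    # SVD of the cross-covariance between the centred point sets.
    U, S, Vt = np.linalg.svd(dst_c.T @ src_c / n)
    # Reflection guard: force det(R) = +1.
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / ((src_c ** 2).sum() / n)
    t = mu_d - s * R @ mu_s
    T = np.eye(4)
    T[:3, :3] = s * R
    T[:3, 3] = t
    return T
```

Given a handful of SLAM keyframe positions and their matching surveyed GPS points (converted to a metric frame such as ENU), the returned matrix plays the same role as `transform.json`.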
Inside the TerraSLAM container:
```bash
cd ~
ros2 launch orb_slam3_ros2_wrapper unirobot_mono.launch.py
```
This command runs the demo with a monocular camera. TerraSLAM also supports stereo cameras:

```bash
ros2 launch orb_slam3_ros2_wrapper unirobot.launch.py
```
In another terminal, start a new shell in the TerraSLAM container:

```bash
docker exec -it TerraSLAM /bin/bash
```

Then:

```bash
cd TerraSLAM_relay
python3 relay.py
```
In another terminal, start another shell in the TerraSLAM container and run:

```bash
cd Database
python3 image_publish.py Mill-video/
```
You will see the printed GPS coordinates in the terminal logs of the earlier `relay.py` process.
On your local machine (no need to enter a new TerraSLAM container):

```bash
cd container_root/TerraSLAM_relay
python3 gui_client.py ../Database/Mill-video -s 127.0.0.1 -p 43322 --google-api-key @your_google_api
```
This will send the images to the ORB-SLAM3 wrapper for SLAM processing. At the same time, a GUI will launch to display the current GPS coordinates and plot the results on a map. By default, the results are displayed on OpenStreetMap; if you provide a Google API key, they are shown on Google Maps instead.
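For reference, placing a GPS fix on an OpenStreetMap view boils down to the standard Web Mercator tile projection. The helper below is a hypothetical illustration of that math, not code taken from `gui_client.py`:

```python
import math

def latlon_to_tile(lat, lon, zoom):
    """Project a GPS fix to fractional OpenStreetMap (Web Mercator)
    tile coordinates at the given zoom level."""
    n = 2 ** zoom                       # tiles per axis at this zoom
    x = (lon + 180.0) / 360.0 * n       # longitude maps linearly
    lat_r = math.radians(lat)
    # Latitude uses the Mercator stretch: asinh(tan(lat)).
    y = (1.0 - math.asinh(math.tan(lat_r)) / math.pi) / 2.0 * n
    return x, y

# (0, 0) lands at the centre of the zoom-1 tile grid:
print(latlon_to_tile(0.0, 0.0, 1))  # (1.0, 1.0)
```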
- Download Blender version 2.93. In our tests, Blender versions above 3.1 also work, but we found Blender 2.93 to be more stable and compatible across different platforms.
- Download the Blender project file:

  ```bash
  wget https://storage.cmusatyalab.org/terra-slam/mill-blender.tgz
  tar xzvf mill-blender.tgz
  ```

  Make sure that `Mill-19.blend` and the model texture folder `mill-19-half-q0to84q` are in the same directory, following this file structure:

  ```
  mill-blender/
  ├── Mill-19.blend            # Open it in Blender
  └── mill-19-half-q0to84q/    # 3D GIS models of Mill-19
      ├── *.bin, *.png, ...    # GIS model metadata
  ```
  This project contains the 3D GIS model of the Mill-19 area, the SLAM point cloud, the alignment between the SLAM and GIS models, a drone model, and the Blender scripts that interact with TerraSLAM. Open `Mill-19.blend` in Blender.
- In Blender's menu bar, switch to the `Scripting` workspace. In the editor panel, select and run `drone_pose.py`. This will start a Blender server (at `127.0.0.1:11223`) that receives the drone pose calculated by TerraSLAM from the client and renders the result in the 3D GIS model. After running the script, you can return to the default `Layout` workspace.
- Run TerraSLAM as above and send pose data to Blender:
  - In the previous terminals inside the TerraSLAM container, run:

    ```bash
    ros2 launch orb_slam3_ros2_wrapper unirobot_mono.launch.py
    ```

    and

    ```bash
    cd Database
    python3 image_publish.py Mill-video
    ```
  - In a new terminal inside the TerraSLAM container, run:

    ```bash
    cd TerraSLAM_runtime
    python3 pose_tcp.py
    ```
You will see a virtual drone flying in the GIS world!
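For reference, the client side of this step can be sketched as a short TCP sender. The newline-delimited JSON payload below is a hypothetical format chosen for illustration; the actual wire protocol is defined by `pose_tcp.py` and `drone_pose.py`:

```python
import json
import socket

def send_pose(position, quaternion, host="127.0.0.1", port=11223):
    """Send one drone pose to the Blender server started by drone_pose.py.

    The newline-delimited JSON message used here is a hypothetical
    format; check pose_tcp.py / drone_pose.py for the real protocol.
    """
    msg = json.dumps({"position": position, "quaternion": quaternion}) + "\n"
    with socket.create_connection((host, port), timeout=5.0) as sock:
        sock.sendall(msg.encode("utf-8"))

# Example: place the drone 50 m above the GIS origin, identity orientation.
# send_pose([0.0, 0.0, 50.0], [1.0, 0.0, 0.0, 0.0])
```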
- Set up your Olympe development environment here. Using a virtual environment is highly recommended.
- Generate a parameter file for your Parrot drone (Olympe). We have prepared one if you want to use it:

  ```bash
  cd /root/colcon_ws/orb_slam3_ros2_wrapper/params
  cp olympe.yaml.temp olympe.yaml
  ```
- Change the `*.yaml` parameter file you want to use in `unirobot_mono.launch.py`
- Launch ORB-SLAM3:

  ```bash
  ros2 launch orb_slam3_ros2_wrapper unirobot_mono.launch.py
  ```

  You should see a window pop up that is waiting for images. This partially indicates that the setup was done correctly.
- Open another terminal and feed real-time drone-captured images to ORB-SLAM3 through ROS 2. Make sure you are in the Olympe virtual development environment:

  ```bash
  cd /root/olympe_dev
  python ros2_streaming.py
  ```
Because the monocular setup must infer depth purely through computer vision, ORB-SLAM3 may warn that it is "not initialized" until it observes enough motion; if you see this warning, move (shake) your camera to help it initialize.
- Set up the ORB-SLAM3 ROS 2 Docker using the steps above. Once you complete step (1) in the Launching ORB-SLAM3 section, you should see a window pop up that is waiting for images. This partially indicates that the setup was done correctly.
- Set up the simulation by following the README here
- Once you are able to teleop the robot, you should be able to run ORB-SLAM3 with both containers (simulation and wrapper) running in parallel.

The simulation and the wrapper both have their `ROS_DOMAIN_ID` set to 55, so they are meant to work out of the box. However, you may face issues if this environment variable is not set properly. Before you start the wrapper, run `ros2 topic list` and make sure the topics namespaced with `scout_2` are visible inside the ORB-SLAM3 container while the simulation is running alongside it.