
aerial-autonomy-stack

The Aerial Autonomy Stack (AAS) is a software stack to:

  1. Develop end-to-end drone autonomy with ROS2
  2. Simulate vision and control in software-in-the-loop, with YOLOv8 and PX4/ArduPilot
  3. Deploy in real drones with NVIDIA Orin/JetPack

For the motivation behind AAS and how it compares to similar projects, read RATIONALE.md

Demo video: aerial-autonomy-stack-v2.mp4

Features

AAS leverages the following frameworks:

  • ROS2 Humble (LTS, EOL 5/2027)
  • Gazebo Sim Harmonic (LTS, EOL 9/2028)
  • PX4 1.16, interfaced via XRCE-DDS
  • ArduPilot 4.6, interfaced via MAVROS
  • YOLOv8 on ONNX Runtime 1.22 (latest stable releases as of 8/2025)
  • L4T 36 (Ubuntu 22-based) / JetPack 6 (for deployment only, latest major release as of 8/2025)


Part 1: Installation of AAS

Important

This stack is developed and tested on an Ubuntu 22.04 host (penultimate LTS, ESM 4/2032) with nvidia-driver-575 and Docker Engine v28 (latest stable releases as of 7/2025), on an i9-13 with an RTX3500 and an i7-11 with an RTX3060. Note that an NVIDIA GPU is required

To set up the requirements, namely (i) Ubuntu 22 with Git LFS, (ii) the NVIDIA driver, (iii) Docker Engine, (iv) the NVIDIA Container Toolkit, and (v) an NVIDIA NGC API key, read PREINSTALL.md
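Before starting a ~25-minute build, a quick host sanity check can save a failed attempt. This is a minimal sketch that only checks for the standard binary names (git, git-lfs, docker, nvidia-smi); it does not replace the full PREINSTALL.md steps:

```shell
# Check that the main prerequisite tools are on the PATH
# (standard binary names assumed; adjust if your install differs)
missing=0
for tool in git git-lfs docker nvidia-smi; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
    missing=$((missing + 1))
  fi
done
echo "$missing missing prerequisite(s)"
```

If anything is reported MISSING, go back through PREINSTALL.md before running the build script.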

# Clone this repo
mkdir -p ~/git
git clone git@github.com:JacopoPan/aerial-autonomy-stack.git ~/git/aerial-autonomy-stack
cd ~/git/aerial-autonomy-stack

Build the Docker Images

Warning

The build script creates two ~20GB images (including lots of tools and artifacts for development)

Building from scratch requires a good/stable internet connection (Ctrl + c and restart if necessary)

# Clone external repos (in github_clones/) and build the Docker images
cd ~/git/aerial-autonomy-stack/scripts
./sim_build.sh # The first build takes ~25', subsequent ones take seconds to minutes

Part 2: Simulation and Development with AAS

# Start a simulation (note: ArduPilot SITL takes ~40s to be ready to arm)
cd ~/git/aerial-autonomy-stack/scripts
AUTOPILOT=px4 NUM_QUADS=1 NUM_VTOLS=1 WORLD=swiss_town ./sim_run.sh # Check the script for more options

On a low-to-mid range laptop (i7-11 with 16GB RAM and RTX3060), AAS simulates 3 PX4 quads with camera and LiDAR at 99% of wall-clock speed (note that ArduPilot's faster physics updates and more complex worlds have higher computational demands)

Once "Ready to Fly", one can take off and control the vehicle from QGroundControl's "Fly View"


Available WORLDs:

  • apple_orchard, a GIS world created using BlenderGIS
  • impalpable_greyness, (default) an empty world with simple shapes
  • shibuya_crossing, a 3D world adapted from cgtrader
  • swiss_town, a photogrammetry world courtesy of Pix4D / pix4d.com

To advance the simulation in discrete time steps, e.g. 1s, from a terminal on the host, run:

docker exec simulation-container bash -c "gz service -s /world/\$WORLD/control --reqtype gz.msgs.WorldControl --reptype gz.msgs.Boolean --req 'multi_step: 250, pause: true'" # Adjust multi_step based on the value of max_step_size in the world's .sdf (defaults: 250 for PX4, 1000 for ArduPilot)
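The multi_step count is just the desired simulated-time advance divided by the world's max_step_size; a quick sanity computation using the default values quoted above:

```shell
# multi_step = sim time to advance (s) / physics step size (s)
# Defaults quoted above: 0.004 s steps for PX4 worlds (250 steps ~ 1 s),
# 0.001 s steps for ArduPilot worlds (1000 steps ~ 1 s)
seconds=1
max_step_size=0.004
multi_step=$(python3 -c "print(round($seconds / $max_step_size))")
echo "multi_step: $multi_step"   # 250 with the PX4 default step size
```

Check max_step_size in the world's .sdf whenever a custom world is used, otherwise the requested "1 s" will advance by a different amount of simulated time.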

To add or disable a wind field, from a terminal on the host, run:

docker exec simulation-container bash -c "gz topic -t /world/\$WORLD/wind/ -m gz.msgs.Wind -p 'linear_velocity: {x: 0.0, y: 3.0}, enable_wind: true'" # Positive X blows from the West, positive Y blows from the South

docker exec simulation-container bash -c "gz topic -t /world/\$WORLD/wind/ -m gz.msgs.Wind -p 'enable_wind: false'" # Disable WindEffects

Tip

Tmux and Docker Shortcuts
  • Move between Tmux windows with Ctrl + b, then n, p
  • Move between Tmux panes with Ctrl + b, then arrow keys
  • Enter copy mode to scroll back with Ctrl + b, then [, navigate with arrow keys, exit with q
  • Split a Tmux window with Ctrl + b, then " (horizontal) or % (vertical)
  • Detach Tmux with Ctrl + b, then d
tmux list-sessions # List all sessions
tmux attach-session -t [session_name] # Reattach a session
tmux kill-session -t [session_name] # Kill a session
tmux kill-server # Kill all sessions

Docker hygiene:

docker ps -a # List containers
docker stop $(docker ps -q) # Stop all containers
docker container prune # Remove all stopped containers

docker images # List images
docker image prune # Remove untagged images
docker rmi <image_name_or_id> # Remove a specific image
docker builder prune # Clear the cache system wide

Fly a Mission

cd ~/git/aerial-autonomy-stack/scripts
AUTOPILOT=px4 NUM_QUADS=1 ./sim_run.sh # Also try AUTOPILOT=ardupilot, or NUM_QUADS=0 NUM_VTOLS=1

# In aircraft 1's terminal
ros2 run mission mission --conops yalla --ros-args -r __ns:=/Drone$DRONE_ID -p use_sim_time:=true # This mission is a simple takeoff, followed by an orbit and a landing, for any vehicle

# Finally, in the simulation's terminal
/simulation_resources/patches/plot_logs.sh # Analyze the flight logs

Command Line Interface

Read the banner comment in the autopilot_interface headers for command line examples (takeoff, orbit, reposition, offboard, land):

Once flown from the CLI, implement your mission in MissionNode.conops_callback()

Development

Launching the sim_run.sh script with MODE=dev does not start the simulation; instead, it mounts the folders simulation_resources, aircraft_resources, and ros2_ws/src as volumes, making it easier to track, commit, and push changes while building and testing them within the containers

# Develop within live containers
cd ~/git/aerial-autonomy-stack/scripts
MODE=dev ./sim_run.sh # Images are pre-built but the ros2_ws/src/ and *_resources/ folders are mounted from the host
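To double-check that the host folders are actually mounted in dev mode, one can inspect the running container; a sketch assuming the container is named simulation-container, as in the commands above:

```shell
# List the bind mounts of the dev container (name assumed: simulation-container)
if command -v docker >/dev/null 2>&1 && [ -n "$(docker ps -q -f name=simulation-container 2>/dev/null)" ]; then
  docker inspect -f '{{range .Mounts}}{{.Source}} -> {{.Destination}}{{"\n"}}{{end}}' simulation-container
  result="inspected"
else
  echo "simulation-container is not running (start it with MODE=dev ./sim_run.sh)"
  result="container-not-running"
fi
```

The listed sources should point back into your ~/git/aerial-autonomy-stack checkout, so edits on the host are immediately visible inside the container.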

Note

Project Structure

aerial-autonomy-stack
│
├── aircraft
│   ├── aircraft_ws
│   │   └── src
│   │       ├── autopilot_interface # Ardupilot/PX4 high-level actions (Takeoff, Orbit, Offboard, Land)
│   │       ├── mission             # Orchestrator of the actions in `autopilot_interface` 
│   │       ├── offboard_control    # Low-level references for the Offboard action in `autopilot_interface` 
│   │       ├── state_sharing       # Publisher of the `/state_sharing_drone_N` topic broadcasted by Zenoh
│   │       └── yolo_inference      # GStreamer video acquisition and publisher of YOLO bounding boxes
│   │
│   └── aircraft.yml.erb            # Aircraft docker tmux entrypoint
│
├── scripts
│   ├── docker
│   │   ├── Dockerfile.aircraft     # Docker image for aircraft simulation and deployment
│   │   └── Dockerfile.simulation   # Docker image for Gazebo and SITL simulation
│   │
│   ├── deploy_build.sh             # Build `Dockerfile.aircraft` for arm64/Orin
│   ├── deploy_run.sh               # Start the aircraft docker on arm64/Orin
│   │
│   ├── sim_build.sh                # Build both dockerfiles for amd64/simulation
│   └── sim_run.sh                  # Start the simulation
│
└── simulation
    ├── simulation_resources
    │   ├── aircraft_models
    │   │   ├── alti_transition_quad # ArduPilot VTOL
    │   │   ├── iris_with_ardupilot  # ArduPilot quad
    │   │   ├── sensor_camera
    │   │   ├── sensor_lidar
    │   │   ├── standard_vtol        # PX4 VTOL
    │   │   └── x500                 # PX4 quad
    │   └── simulation_worlds
    │       ├── apple_orchard.sdf
    │       ├── impalpable_greyness.sdf
    │       ├── shibuya_crossing.sdf
    │       └── swiss_town.sdf
    │
    ├── simulation_ws
    │   └── src
    │       └── ground_system        # Publisher of topic `/tracks` broadcasted by Zenoh
    │
    └── simulation.yml.erb           # Simulation docker tmux entrypoint

Part 3: Deployment of AAS

Important

These instructions are tested on a Holybro Jetson Baseboard kit that includes (i) a Pixhawk 6X autopilot and (ii) an NVIDIA Orin NX 16GB computer, connected via both serial and Ethernet

To setup (i) PX4's DDS UDP client, (ii) ArduPilot serial MAVLink bridge, (iii) JetPack 6, (iv) Docker Engine, (v) NVIDIA Container Toolkit, and (vi) NVIDIA NGC API Key on Orin, read AVIONICS.md

The Holybro Jetson Baseboard comes with (i) an integrated 4-way (Orin, 6X, RJ-45, JST) Ethernet switch and (ii) two JST USB 2.0 ports that can be connected to ASIX Ethernet adapters to create additional network interfaces

Make sure to configure Orin, the 6X's XRCE-DDS, the IP radio, Zenoh, etc. consistently with your network setup. The camera acquisition pipeline should be set up in yolo_inference_node.py, and the LiDAR should publish on topic /lidar_points for KISS-ICP (if necessary, discuss in the Issues)

# On Jetson Orin NX, build for arm64 with TensorRT support
mkdir -p ~/git
git clone git@github.com:JacopoPan/aerial-autonomy-stack.git ~/git/aerial-autonomy-stack
cd ~/git/aerial-autonomy-stack/scripts
./deploy_build.sh # The first build takes ~1h (mostly to build onnxruntime-gpu from source)
# On Jetson Orin NX, start and attach the aerial-autonomy-stack (e.g., from ssh)
DRONE_TYPE=quad AUTOPILOT=px4 DRONE_ID=1 CAMERA=true LIDAR=false ./deploy_run.sh
docker exec -it aircraft-container tmux attach
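Once attached, a few sanity checks on the ROS2 graph can confirm the stack came up; a sketch assuming the topic names from the Project Structure above and DRONE_ID=1:

```shell
# Inside the aircraft container: sanity-check the ROS2 graph
# (topic name /state_sharing_drone_1 assumes DRONE_ID=1)
if command -v ros2 >/dev/null 2>&1; then
  ros2 node list                                 # expect the autopilot_interface and mission nodes
  ros2 topic echo /state_sharing_drone_1 --once  # state broadcast over Zenoh
  status="ros2-found"
else
  echo "ros2 not on PATH: run this inside the aircraft container"
  status="ros2-missing"
fi
```

With CAMERA=true, the yolo_inference node should also appear in the node list; with LIDAR=true, /lidar_points should be publishing.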

You've done a man's job, sir. I guess you're through, huh?