Demo workspace for:
- https://github.com/AIT-Assistive-Autonomous-Systems/led_strip_hmi
- https://github.com/AIT-Assistive-Autonomous-Systems/led_strip_hmi_embedded
ws_led_strip_hmi is a ROS 2 workspace demonstrating how to map 3D perception data (e.g., human poses or laser scans) onto configurable virtual LED-strip layouts. This enables robots to convey situational awareness via physical LEDs or RViz markers, improving safety and transparency around autonomous platforms.
- **dummy_detection_publisher**
  - Publishes mock 3D detections of persons (poses, bounding boxes) on `/detections` to test downstream nodes (see the sketch after the architecture diagram below).
  - Configurable via `persons.yml` (`src/dummy_detection_publisher/cfg/persons.yml`).
- **led_strip_hmi_msgs**
  - Defines the custom ROS 2 messages driving the HMI: `LEDStrip.msg`, `LEDStripConfig.msg`, `LEDStripEffect.msg`
  - Projection info types (`LEDStripProjectionInfo.msg`, `VirtualLEDStripSegment.msg`, etc.)
- **led_strip_hmi_common**
  - Shared utilities for parsing YAML configs, handling coordinate transforms, and modeling virtual LED strips:
    - `config_utils.py`, `config.py` – YAML parsing & validation
    - `tf_utils.py` – helper functions for TF transforms
    - `virtual_strip.py` – abstraction of LED positions, densities, and indexing (see the sketch after this package list)
- **led_strip_hmi_visualization**
  - The `strip_visualizer` node publishes a `MarkerArray` to RViz, rendering static LED outlines and dynamic highlights.
  - Configurable via `src/led_strip_hmi_visualization/config/strips.yaml`.
- **led_strip_hmi_projector**
  - The `projector_node` subscribes to `/detections` or sensor topics (e.g., `/scan`) and computes which LEDs should light up.
  - Publishes normalized LED indices on `/led_indices` and a debug image on `/debug/image`.
  - Uses `drawing.py` and adapter modules to simulate projection onto real surfaces.
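To make the indexing idea concrete, here is a minimal, hypothetical sketch of the kind of abstraction `virtual_strip.py` provides and the index computation `projector_node` performs: a strip modeled as a polyline with a fixed LED density, mapping a 3D point to the normalized index of the nearest LED. Class and method names are illustrative assumptions, not the package's actual API.

```python
import numpy as np


class VirtualStrip:
    """Illustrative stand-in for virtual_strip.py (names are assumptions)."""

    def __init__(self, polyline, leds_per_meter: float):
        self.polyline = np.asarray(polyline, dtype=float)  # (N, 3) points
        seg = np.diff(self.polyline, axis=0)
        seg_len = np.linalg.norm(seg, axis=1)
        self.cum_len = np.concatenate(([0.0], np.cumsum(seg_len)))
        self.num_leds = max(1, int(round(self.cum_len[-1] * leds_per_meter)))

    def nearest_index(self, point) -> float:
        """Normalized index in [0, 1] of the LED closest to `point`."""
        point = np.asarray(point, dtype=float)
        best_s, best_d = 0.0, np.inf
        for i in range(len(self.polyline) - 1):
            a, b = self.polyline[i], self.polyline[i + 1]
            ab = b - a
            # Closest point on segment a-b, as an arc-length parameter.
            t = np.clip(np.dot(point - a, ab) / np.dot(ab, ab), 0.0, 1.0)
            d = np.linalg.norm(point - (a + t * ab))
            if d < best_d:
                best_d, best_s = d, self.cum_len[i] + t * np.linalg.norm(ab)
        return best_s / self.cum_len[-1]


# Example: a 2 m square outline at 30 LEDs/m; a person at (1.0, 1.5, 0)
# maps to the normalized index of the nearest LED on the outline.
strip = VirtualStrip([[0, 0, 0], [2, 0, 0], [2, 2, 0], [0, 2, 0], [0, 0, 0]], 30.0)
print(strip.num_leds, strip.nearest_index([1.0, 1.5, 0.0]))
```

The packages are wired together as follows: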
```mermaid
graph LR
  subgraph Dummy Detections
    D[dummy_detection_publisher\ndummy_publisher]
  end
  subgraph Projection
    P[led_strip_hmi_projector\nprojector_node]
  end
  subgraph Visualization
    V[led_strip_hmi_visualization\nstrip_visualizer]
  end
  RV[RViz\nrviz2]

  D -->|/detections| P
  P -->|/led_indices| V
  P -->|/debug/image| RV
  V -->|MarkerArray| RV
  P -->|tf frames| RV
```
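The graph is fed by `dummy_detection_publisher`. A minimal sketch of such a mock publisher, assuming `vision_msgs/Detection3DArray` as the detection type (the package's actual message definition may differ):

```python
# Hypothetical mock detection publisher; assumes vision_msgs/Detection3DArray
# on /detections. The real dummy_detection_publisher may use another type.
import rclpy
from rclpy.node import Node
from vision_msgs.msg import Detection3D, Detection3DArray


class MockPersonPublisher(Node):
    def __init__(self):
        super().__init__('mock_person_publisher')
        self.pub = self.create_publisher(Detection3DArray, '/detections', 10)
        self.t = 0.0
        self.create_timer(0.1, self.tick)  # publish at 10 Hz

    def tick(self):
        msg = Detection3DArray()
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'map'
        det = Detection3D()
        # One person walking along x at 0.5 m/s, with a rough bounding box.
        det.bbox.center.position.x = 0.5 * self.t
        det.bbox.center.position.y = 1.0
        det.bbox.size.x, det.bbox.size.y, det.bbox.size.z = 0.6, 0.6, 1.8
        msg.detections.append(det)
        self.pub.publish(msg)
        self.t += 0.1


def main():
    rclpy.init()
    rclpy.spin(MockPersonPublisher())


if __name__ == '__main__':
    main()
```

To run the workspace you will need: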
- ROS 2 (Humble Hawksbill or later)
- Python 3.8+ with dependencies: PyYAML, numpy, opencv-python
- vcstool for managing repos
- Docker & VS Code Remote - Containers (optional, for reproducible dev env)
- X11 forwarding or local display for RViz GUI
```bash
# Clone & initialize
git clone https://github.com/AIT-Assistive-Autonomous-Systems/led_strip_hmi_demo_workspace.git
cd led_strip_hmi_demo_workspace
vcs import src/ < src/ros2.repos
```

Then re-open the folder in the VS Code devcontainer.

```bash
# In the container: install dependencies
rosdep update
rosdep install --from-paths src --ignore-src -r -y

# Build the workspace (or press Ctrl+Shift+B in VS Code)
./build.sh

# Source the overlay
source install/setup.bash
```
Launch one of the provided scenarios:
| Demo | Launch File | Config File | Description |
|---|---|---|---|
| A200 | `ros2 launch led_strip_hmi_demos a200.launch.py` | `a200.yaml` | Clearpath Husky A200 robot simulation |
| Caripu | `ros2 launch led_strip_hmi_demos complex_strip.launch.py` | `robot.yaml` | Mock-person detections + RViz visualization + projector on robot model |
| LaserScan | `ros2 launch led_strip_hmi_demos laserscan.launch.py` | `laserscan.yaml` | Projects live or recorded LaserScan data |
Running the Husky A200 demo, for example, you should see a person passing by the robot. The Husky's LED strips light up in the direction of the detected person, signaling that the robot "sees" the human.
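While a demo is running, you can verify the pipeline with standard ROS 2 tooling: `ros2 topic echo /led_indices` shows the computed LED indices, and `rqt_image_view` pointed at `/debug/image` shows the projector's debug rendering.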
- **LED strips:** define strip polygons, number of LEDs, frames, ordering, and perception parameters.
  - `src/led_strip_hmi_demos/config/a200.yaml`
  - `src/led_strip_hmi_demos/config/complex_strip.yaml`
- **Persons:** specify start/end points, velocities, and dimensions for each dummy person.
  - `src/dummy_detection_publisher/cfg/persons.yml`

Feel free to edit these YAMLs to test new layouts or detection scenarios.
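After editing, a quick way to sanity-check a config is to load it with PyYAML (already a listed dependency) and print its top-level structure; no particular schema is assumed here:

```python
# Load one of the demo configs and summarize its top-level keys.
import yaml

with open('src/led_strip_hmi_demos/config/a200.yaml') as f:
    cfg = yaml.safe_load(f)

for key, value in cfg.items():
    # Summarize containers by size; print scalars directly.
    summary = f'{len(value)} entries' if isinstance(value, (list, dict)) else value
    print(f'{key}: {summary}')
```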
Run all unit & integration tests:

```bash
./test.sh
```

Or test a single package:

```bash
./test_single_pkg.sh <package_name>
```
Contributions are welcome! Please:
- Fork the repo and create a feature branch
- Follow code style (flake8, black, pep257)
- Write tests for new functionality
- Submit a pull request with a clear description
This project is licensed under the Apache License 2.0. See LICENSE for details.