Docker container scripts for ROS2 development.
Contents of the Docker image:
- Ubuntu 24.04 (Noble)
- ROS2 (Jazzy)
- Gazebo (Harmonic)
Additional content:
- MoveIt2
  - built from source, for the moveit_py package
- ASL Controller
- MyCobot-280 simulation
  - https://github.com/automaticaddison/mycobot_ros2 (branch=jazzy)
  - https://github.com/moveit/moveit_task_constructor (branch=jazzy)
  - https://github.com/moveit/warehouse_ros_mongo (branch=ros2)
- Yahboom ROSMASTER-X3 simulation
- MOGI-ROS simulation
Clone the repository with git:
- git clone https://github.com/AlbertaBeef/robotics_docker
- cd robotics_docker
The pre-built Docker image can be pulled from Docker Hub (see below).
If you prefer to build the Docker image locally, start by downloading the following archive to the "robotics_docker" directory:
- gazebo_models.zip (Google Drive)
Then launch the Docker build script:
- cd robotics_docker/docker
- source ./build.sh
If not already done, build the Docker image, or pull it from Docker Hub:
- docker pull albertabeef/robotics_docker:latest
Create a directory for shared content, and indicate its location in the following file:
- robotics_docker/compose/docker-compose.yml
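As an illustration only, the shared directory is typically mapped with a volumes entry in the compose file; the host path and container mount point below are placeholders, not the repository's actual values:

```yaml
# compose/docker-compose.yml (excerpt) -- illustrative sketch only.
# Replace /path/on/host/shared with the shared directory you created;
# the container-side mount point shown here is an assumption.
services:
  robotics_demo:
    image: albertabeef/robotics_docker:latest
    volumes:
      - /path/on/host/shared:/root/shared   # host path : container path
```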
Launch the Docker image using the compose functionality:
- cd robotics_docker/compose
- docker compose up -d robotics_demo
Make the local host's GUI available to the Docker container, and open a shell inside it:
- xhost +
- docker compose exec robotics_demo bash
Launch the asl_controller_twist node with the usbcam_publisher and turtlesim nodes (run each launch file in a separate terminal):
- ros2 launch asl_mediapipe_pointnet demo01_turtlesim_asl_part1.launch.py
- ros2 launch asl_mediapipe_pointnet demo01_turtlesim_asl_part2.launch.py
Launch the hand_controller_asl_twist node with usbcam_publisher and turtlesim nodes:
- ros2 launch hand_controller demo01_turtlesim_part1_asl.launch.py
- ros2 launch hand_controller demo01_turtlesim_part2.launch.py
Control Turtle with Hand Signs
- A : Advance
- B : Backup
- L : Turn Left
- R : Turn Right
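As a rough sketch of how a sign-to-motion controller like this can work, the snippet below maps the A/B/L/R legend above to a Twist-style velocity command. The `Vector3`/`Twist` stand-in classes, step values, and `sign_to_twist` helper are illustrative assumptions so the sketch runs without ROS 2; they are not the actual code of the asl_controller_twist node.

```python
from dataclasses import dataclass, field

# Minimal stand-ins for geometry_msgs/Vector3 and Twist, so this runs without ROS 2.
@dataclass
class Vector3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

@dataclass
class Twist:
    linear: Vector3 = field(default_factory=Vector3)
    angular: Vector3 = field(default_factory=Vector3)

# Hypothetical sign-to-motion table matching the A/B/L/R legend above:
# (linear.x forward velocity, angular.z yaw rate). Magnitudes are arbitrary.
SIGN_TO_MOTION = {
    "A": (0.5, 0.0),   # Advance: forward
    "B": (-0.5, 0.0),  # Backup: reverse
    "L": (0.0, 1.0),   # Turn Left: positive yaw rate
    "R": (0.0, -1.0),  # Turn Right: negative yaw rate
}

def sign_to_twist(sign: str) -> Twist:
    """Build a Twist for a recognized hand sign; unknown signs stop the robot."""
    linear_x, angular_z = SIGN_TO_MOTION.get(sign, (0.0, 0.0))
    msg = Twist()
    msg.linear.x = linear_x
    msg.angular.z = angular_z
    return msg
```

In an actual node, a message like this would be published on the turtle's cmd_vel topic each time a sign is recognized.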
Launch the asl_controller_twist node with the MOGI-ROS wheeled vehicle:
- ros2 launch asl_mediapipe_pointnet demo11_mogiros_car_part1_asl.launch.py
- ros2 launch asl_mediapipe_pointnet demo11_mogiros_car_part2.launch.py
Launch the hand_controller_asl_twist node with the MOGI-ROS wheeled vehicle:
- ros2 launch hand_controller demo11_mogiros_car_part1_asl.launch.py
- ros2 launch hand_controller demo11_mogiros_car_part2.launch.py
Control Vehicle with Hand Signs
- A : Advance
- B : Backup
- L : Turn Left
- R : Turn Right
Launch the asl_controller_twist node with the ROSMASTER-X3 vehicle:
- ros2 launch asl_mediapipe_pointnet demo12_rosmaster_part1_asl.launch.py
- ros2 launch asl_mediapipe_pointnet demo12_rosmaster_part2.launch.py
Launch the hand_controller_asl_twist node with the ROSMASTER-X3 vehicle:
- ros2 launch hand_controller demo12_rosmaster_part1_asl.launch.py
- ros2 launch hand_controller demo12_rosmaster_part2.launch.py
Control Vehicle with Hand Signs
- A : Advance
- B : Backup
- L : Turn Left
- R : Turn Right
Launch the asl_controller_pose node with MOGI-ROS simple robotic arm:
- ros2 launch asl_mediapipe_pointnet demo21_mogiros_arm_part1_asl.launch.py
- ros2 launch asl_mediapipe_pointnet demo21_mogiros_arm_part2.launch.py
Launch the hand_controller_asl_pose node with MOGI-ROS simple robotic arm:
- ros2 launch hand_controller demo21_mogiros_arm_part1_asl.launch.py
- ros2 launch hand_controller demo21_mogiros_arm_part2.launch.py
Control Robotic Arm with Left/Right Hands:
- Left Hand
  - L : Turn Arm Left
  - R : Turn Arm Right
  - A : Advance Arm (shoulder joint)
  - B : Backup Arm (shoulder joint)
  - U : Lift Arm (elbow joint)
  - Y : Lower Arm (elbow joint)
- Right Hand
  - A : Close Gripper
  - B : Open Gripper
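A dual-hand scheme like the one above can be sketched as two lookup tables keyed by (hand, sign), each producing a joint-space increment. The joint names, step size, and `apply_sign` helper below are hypothetical placeholders, not the actual joints or code of the MOGI-ROS arm demo.

```python
# Hypothetical (hand, sign) -> joint increment tables for the arm legend above.
# Joint names and the step size are illustrative assumptions.
STEP = 0.1  # radians per recognized sign

LEFT_HAND_ACTIONS = {
    "L": ("base_joint", +STEP),      # Turn Arm Left
    "R": ("base_joint", -STEP),      # Turn Arm Right
    "A": ("shoulder_joint", +STEP),  # Advance Arm
    "B": ("shoulder_joint", -STEP),  # Backup Arm
    "U": ("elbow_joint", +STEP),     # Lift Arm
    "Y": ("elbow_joint", -STEP),     # Lower Arm
}

RIGHT_HAND_ACTIONS = {
    "A": ("gripper", -STEP),  # Close Gripper
    "B": ("gripper", +STEP),  # Open Gripper
}

def apply_sign(joint_positions: dict, hand: str, sign: str) -> dict:
    """Return updated joint positions after one recognized sign.

    Unrecognized signs leave the arm where it is.
    """
    table = LEFT_HAND_ACTIONS if hand == "left" else RIGHT_HAND_ACTIONS
    action = table.get(sign)
    if action is None:
        return joint_positions
    joint, delta = action
    updated = dict(joint_positions)
    updated[joint] = updated.get(joint, 0.0) + delta
    return updated
```

Keeping the two tables separate is what lets the same sign (e.g. "A") mean different things on the left and right hand.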
Launch the asl_controller_pose node with the MyCobot-280 robotic arm:
- moveit &
- ros2 launch asl_mediapipe_pointnet demo31_mycobot_part1_asl.launch.py
- ros2 launch asl_mediapipe_pointnet demo31_mycobot_part2.launch.py
Launch the hand_controller_asl_pose node with the MyCobot-280 robotic arm:
- moveit &
- ros2 launch hand_controller demo31_mycobot_part1_asl.launch.py
- ros2 launch hand_controller demo31_mycobot_part2.launch.py
Control Robotic Arm with Hand Signs
- L : Move Left
- R : Move Right
- A : Move Forward
- B : Move Backward
- U : Move Up
- Y : Move Down
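For a pose-based controller, the legend above amounts to nudging the end-effector target by a small Cartesian offset per sign; the resulting target pose can then be handed to a motion planner such as moveit_py. The axis conventions, step size, and `next_target` helper below are illustrative assumptions, not the actual behavior of the asl_controller_pose node.

```python
# Illustrative sign -> Cartesian step table for the L/R/A/B/U/Y legend above.
# Deltas are metres in an assumed base frame: x forward, y left, z up.
CARTESIAN_STEP = 0.02

SIGN_TO_DELTA = {
    "L": (0.0, +CARTESIAN_STEP, 0.0),  # Move Left
    "R": (0.0, -CARTESIAN_STEP, 0.0),  # Move Right
    "A": (+CARTESIAN_STEP, 0.0, 0.0),  # Move Forward
    "B": (-CARTESIAN_STEP, 0.0, 0.0),  # Move Backward
    "U": (0.0, 0.0, +CARTESIAN_STEP),  # Move Up
    "Y": (0.0, 0.0, -CARTESIAN_STEP),  # Move Down
}

def next_target(pose_xyz, sign):
    """Offset the current end-effector position by the sign's delta.

    Unknown signs return the position unchanged.
    """
    dx, dy, dz = SIGN_TO_DELTA.get(sign, (0.0, 0.0, 0.0))
    x, y, z = pose_xyz
    return (x + dx, y + dy, z + dz)
```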
References:
- The Complete Guide to Docker for ROS 2 Jazzy Projects
Automatic Addison on-line Tutorials:
- https://automaticaddison.com/tutorials
- https://github.com/automaticaddison/mycobot_ros2 (branch=jazzy)
- https://github.com/automaticaddison/yahboom_rosmaster
MOGI-ROS on-line Tutorials:
- https://github.com/MOGI-ROS/Week-1-2-Introduction-to-ROS2
- https://github.com/MOGI-ROS/Week-3-4-Gazebo-basics
- https://github.com/MOGI-ROS/Week-5-6-Gazebo-sensors
- https://github.com/MOGI-ROS/Week-7-8-ROS2-Navigation
- https://github.com/MOGI-ROS/Week-9-10-Simple-arm
Accelerating MediaPipe:
- Hackster Series Part 1 : Blazing Fast Models
- Hackster Series Part 2 : Insightful Datasets for ASL recognition
- Hackster Series Part 3 : Accelerating the MediaPipe models with Vitis-AI 3.5
- Hackster Series Part 4 : Accelerating the MediaPipe models with Hailo-8
- Hackster Series Part 5 : Accelerating the MediaPipe models on RPI5 AI Kit
- Hackster Series Part 6 : Accelerating the MediaPipe models with MemryX
- Blaze Utility (python version) : blaze_app_python
- Blaze Utility (C++ version) : blaze_app_cpp
ASL Recognition using PointNet (by Edward Roe):
- Medium Article : ASL Recognition using PointNet and MediaPipe
- Kaggle Dataset : American Sign Language Dataset
- GitHub Source : pointnet_hands