Turtlebot4 Processing Expectations #552
-
@slowrunner have you disabled diagnostics on the TurtleBot4 (through the setup tool)? That can be part of the picture for decreasing CPU usage onboard. Also, CPU usage does change depending on your RMW selection, so I would suggest exploring that as well if you are running into high CPU usage situations.
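For example, here is a minimal sketch of pinning the RMW per launch file rather than per shell, assuming a standard ROS 2 Python launch file (the talker node is only a stand-in for your own nodes):

```python
from launch import LaunchDescription
from launch.actions import SetEnvironmentVariable
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # Select the RMW before any nodes start; CycloneDDS is often
        # lighter on onboard CPU than the default, but measure on your
        # own robot rather than taking my word for it.
        SetEnvironmentVariable('RMW_IMPLEMENTATION', 'rmw_cyclonedds_cpp'),
        Node(package='demo_nodes_py', executable='talker', name='talker'),
    ])
```

The same variable can simply be exported in the shell before `ros2 launch`; the launch action just keeps the choice reproducible.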
-
At this point I am pleased with the Raspberry Pi 5 performance running TurtleBot4 localization and navigation on-board. Once given the starting pose, it successfully navigates to goal poses, maintaining localization throughout its adventures in my home environment. At no time was it hindered by operating near the limits of the Raspberry Pi 5 processing capabilities. The message traffic from the RPi to the remote visualization platform allowed seeing the plans, the poses, and the camera preview at 30 FPS without interruptions.
-
One of the exceptional features of ROS is the ability to create a distributed node architecture, where the "robot" is a combination of a mobile sensor platform and remote analysis, decision making, and control.
In fact, given the processing available on a Raspberry Pi 4 TurtleBot4, the official recommendation is to utilize remote processing for mapping, localization, and navigation tasks.
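To make the distributed pattern concrete, here is a minimal rclpy sketch of a remote-side listener (the node and file names are my own invention; it assumes the robot publishes its LIDAR on /scan and that both machines share a ROS_DOMAIN_ID):

```python
# remote_scan_monitor.py -- hypothetical example; runs on the desktop,
# not on the robot. DDS discovery delivers the robot's topics remotely,
# so no robot-side changes are needed.
import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import LaserScan

class RemoteScanMonitor(Node):
    def __init__(self):
        super().__init__('remote_scan_monitor')
        # Sensor-data QoS matches the best-effort publisher onboard.
        self.create_subscription(LaserScan, '/scan',
                                 self.on_scan, qos_profile_sensor_data)

    def on_scan(self, msg: LaserScan):
        self.get_logger().info(f'scan with {len(msg.ranges)} ranges')

def main():
    rclpy.init()
    rclpy.spin(RemoteScanMonitor())

if __name__ == '__main__':
    main()
```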
The goal for all the robots I build is to utilize the onboard processing power in ways that enable the robot to exist autonomously. Indeed, when mapping is performed in async mode with smooth, slow motion around my home, the Raspberry Pi 4 can create high-quality maps that fully support subsequent onboard localization and navigation.
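For reference, a minimal sketch of what launching async mapping can look like (my own stripped-down launch file, not the TurtleBot4 one, and the parameters are illustrative):

```python
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        # slam_toolbox's async node drops scans it cannot keep up with
        # instead of stalling, which is why slow, smooth motion still
        # produces clean maps on a loaded Raspberry Pi.
        Node(package='slam_toolbox',
             executable='async_slam_toolbox_node',
             name='slam_toolbox',
             parameters=[{'use_sim_time': False}]),
    ])
```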
I have since upgraded my robot TB5-WaLI to a Raspberry Pi 5, and find that localization and navigation running the turtlebot4.service and my WaLI nodes (wali_node, say_server, odometer) average 75% of the Pi 5's four-core CPU, with some brief moments at 100%.
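For anyone wanting to reproduce that kind of measurement, something like this psutil sketch (a hypothetical helper of mine, not part of the TurtleBot4 stack) reports the average and peak across all cores:

```python
# cpu_sampler.py -- hypothetical helper; sample total CPU once a second
# for a minute while the turtlebot4.service and application nodes run.
import psutil

samples = []
for _ in range(60):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    samples.append(sum(per_core) / len(per_core))

print(f'avg {sum(samples)/len(samples):.0f}%  peak {max(samples):.0f}%')
```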
Last year, brief experiments with onboard RTABmap processing showed a similar "uses every cycle" load and the need to move slowly, but suggested success is possible by limiting the frame rate and robot motion appropriately [1]. Asking the Raspberry Pi processor to perform vision tasks without a strong GPU requires realistic expectations.
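One way to limit the frame rate is a small republisher between the camera and RTABmap; here is a sketch of the idea (the node and topic names are assumptions, so adjust them to your setup):

```python
# image_throttle.py -- hypothetical node; caps the image rate feeding
# RTABmap by republishing at most N frames per second.
import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import Image

class ImageThrottle(Node):
    def __init__(self):
        super().__init__('image_throttle')
        self.min_period_ns = int(1e9 / 5)            # cap at 5 FPS
        self.last = self.get_clock().now()
        self.pub = self.create_publisher(Image, 'image_throttled', 1)
        self.create_subscription(Image, '/oakd/rgb/preview/image_raw',
                                 self.on_image, qos_profile_sensor_data)

    def on_image(self, msg: Image):
        now = self.get_clock().now()
        if (now - self.last).nanoseconds >= self.min_period_ns:
            self.last = now
            self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(ImageThrottle())

if __name__ == '__main__':
    main()
```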
The 4 TOPS neural-net processor in the Oak-D cameras provides object recognition along with remote video without taxing the Raspberry Pi in the least. Producing 3D point clouds to feed RTABmap does introduce additional load on the Raspberry Pi, which limits the frame rate at which autonomous vision processing can run.
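To illustrate the on-camera inference pattern, here is a rough depthai sketch (the blob path is a placeholder; on a stock TurtleBot4 this plumbing is normally handled by the Oak-D ROS nodes rather than written by hand):

```python
# on_camera_detect.py -- rough sketch; detection runs on the camera's
# own neural engine, so frames never touch the Raspberry Pi's CPU.
import depthai as dai

pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)                 # MobileNet-SSD input size
cam.setInterleaved(False)

nn = pipeline.create(dai.node.MobileNetDetectionNetwork)
nn.setBlobPath('mobilenet-ssd.blob')         # placeholder model path

cam.preview.link(nn.input)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName('detections')
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    q = device.getOutputQueue('detections', maxSize=4, blocking=False)
    while True:
        for det in q.get().detections:       # results only, no frames
            print(det.label, f'{det.confidence:.2f}')
```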
In all, the TurtleBot4 is a very flexible platform for autonomous or distributed-architecture robots, given appropriate expectations.
[1] With the republisher isolating the Create3, this is a significant factor enabling onboard or remote RTABmap!
With diagnostics disabled, localization and navigation average 50-75% CPU, with brief maximums around 100%.