Replies: 4 comments 1 reply
- I think this would be a great addition to tesseract.
- Someone translated Three.js to C++: https://github.com/markaren/threepp. The demos are pretty impressive, and I think we could potentially use this for 3D rendering. I already have experience using the JS version, so I know the API.
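For context, threepp keeps the Three.js object model (Scene, PerspectiveCamera, Mesh, materials) almost one-to-one in C++. The snippet below is a minimal sketch adapted from the threepp examples; the exact Canvas/GLRenderer constructor signatures and the animate callback are from memory and may differ between threepp versions.

```cpp
#include "threepp/threepp.hpp"

using namespace threepp;

int main() {
    // Window + renderer (names follow the threepp examples; may vary by version)
    Canvas canvas("demo");
    GLRenderer renderer(canvas.size());

    auto scene = Scene::create();
    auto camera = PerspectiveCamera::create(60, canvas.aspect(), 0.1f, 100.f);
    camera->position.z = 5;

    // Same Scene/Mesh/Material object model as Three.js
    auto geometry = BoxGeometry::create();
    auto material = MeshBasicMaterial::create();
    material->color = Color::green;
    auto box = Mesh::create(geometry, material);
    scene->add(box);

    // Render loop
    canvas.animate([&] {
        box->rotation.y += 0.01f;
        renderer.render(*scene, *camera);
    });
}
```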
- @Levi-Armstrong is raycasting exposed through the collision interface? We will need it to implement lidar and depth sensors.
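Whichever backend ends up exposing it, lidar and depth simulation reduces to batches of ray queries: origin, direction, and max range in, first-hit distance out. As a reference for what each query has to compute (not tied to any tesseract_collision API; the function below is my own illustration), the per-ray test against a single triangle is Möller–Trumbore:

```cpp
#include <array>
#include <cmath>
#include <optional>

using Vec3 = std::array<double, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) { return {a[0] - b[0], a[1] - b[1], a[2] - b[2]}; }
static Vec3 cross(const Vec3& a, const Vec3& b) {
  return {a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0]};
}
static double dot(const Vec3& a, const Vec3& b) { return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]; }

// Moeller-Trumbore ray/triangle test: returns the hit distance along the
// (normalized) ray direction, or nullopt on a miss or an out-of-range hit.
std::optional<double> rayTriangle(const Vec3& origin, const Vec3& dir,
                                  const Vec3& v0, const Vec3& v1, const Vec3& v2,
                                  double max_range)
{
  constexpr double eps = 1e-9;
  const Vec3 e1 = sub(v1, v0);
  const Vec3 e2 = sub(v2, v0);
  const Vec3 p = cross(dir, e2);
  const double det = dot(e1, p);
  if (std::fabs(det) < eps) return std::nullopt;  // ray parallel to triangle plane
  const double inv_det = 1.0 / det;
  const Vec3 t = sub(origin, v0);
  const double u = dot(t, p) * inv_det;
  if (u < 0.0 || u > 1.0) return std::nullopt;    // outside barycentric range
  const Vec3 q = cross(t, e1);
  const double v = dot(dir, q) * inv_det;
  if (v < 0.0 || u + v > 1.0) return std::nullopt;
  const double dist = dot(e2, q) * inv_det;
  if (dist < eps || dist > max_range) return std::nullopt;  // behind origin or too far
  return dist;
}
```

A lidar or depth camera would fan out one such query per beam/pixel against the whole scene and keep the nearest hit, which is exactly the kind of bulk operation a collision backend can accelerate with its broadphase structures.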
- It depends on your requirements. For simulating rendering sensors such as cameras and lidars, gz-sensors pulls in gz-rendering for creating the 3D scene, and that uses ogre-next as the underlying render engine. It is not trivial to add or swap in a different engine like threepp / three.js.
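To illustrate the coupling: gz-sensors builds its scene through the gz-rendering abstraction, which loads a concrete engine (ogre2, i.e. ogre-next) by name. The sketch below is adapted from the gz-rendering examples as I remember them; the header path and exact method names are assumptions and may differ between Gazebo releases.

```cpp
// Rough sketch of scene creation through gz-rendering (assumed API/versions).
#include <gz/rendering.hh>

int main()
{
  // gz-sensors does the equivalent of this internally: load a render engine
  // by name -- in practice ogre2 (ogre-next) -- and build the scene in it.
  gz::rendering::RenderEngine* engine = gz::rendering::engine("ogre2");
  if (!engine) return 1;

  gz::rendering::ScenePtr scene = engine->CreateScene("scene");

  // This is where the scene graph (visuals, materials, lights) would be
  // populated from the Tesseract environment before any sensor renders.
  gz::rendering::VisualPtr root = scene->RootVisual();

  gz::rendering::CameraPtr camera = scene->CreateCamera("camera");
  camera->SetImageWidth(640);
  camera->SetImageHeight(480);
  root->AddChild(camera);

  gz::rendering::Image image = camera->CreateImage();
  camera->Capture(image);  // one RGB frame for a simulated camera sensor
  return 0;
}
```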
- I frequently need to simulate sensors such as cameras, lidar, point clouds, and depth cameras. Currently I am using Gazebo to simulate sensors, but this is overkill and adds a lot of unnecessary complexity. The Tesseract environment has enough information to simulate these sensors. We added textures and advanced materials to the mesh models a while ago, so we can potentially simulate cameras fairly well. The other sensor types are based on collision-detection rays. Is there any interest in a `tesseract_sensors` package that implements some of these sensors?

  @iche033 would it be reasonable to use the `gz-sensors` package to simulate sensors using the Tesseract scene information? The relevant data types are found in https://github.com/tesseract-robotics/tesseract/tree/master/tesseract_scene_graph and https://github.com/tesseract-robotics/tesseract/tree/master/tesseract_environment
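To make the proposal concrete, here is a purely hypothetical sketch of what a ray-based sensor in a `tesseract_sensors` package could look like. Everything in the `tesseract_sensors` namespace is invented for illustration; only `tesseract_environment::Environment` is an existing type, and even its header path should be double-checked against the Tesseract version in use.

```cpp
// Hypothetical tesseract_sensors lidar interface -- illustration only.
#include <tesseract_environment/environment.h>  // assumed header path
#include <Eigen/Geometry>
#include <memory>
#include <vector>

namespace tesseract_sensors  // hypothetical package/namespace
{
struct LidarConfig
{
  int num_rays{360};
  double min_range{0.05};
  double max_range{30.0};
};

class LidarSensor
{
public:
  LidarSensor(std::shared_ptr<const tesseract_environment::Environment> env, LidarConfig config)
    : env_(std::move(env)), config_(config)
  {
  }

  // Cast config_.num_rays rays from sensor_pose through the environment's
  // scene geometry and return one range per ray (max_range on a miss).
  // Internally this would use whatever ray/sweep query the collision
  // interface exposes.
  std::vector<double> simulate(const Eigen::Isometry3d& sensor_pose) const;

private:
  std::shared_ptr<const tesseract_environment::Environment> env_;
  LidarConfig config_;
};
}  // namespace tesseract_sensors
```

A camera sensor would follow the same pattern but render the scene graph (meshes, textures, materials) instead of casting collision rays, which is the part gz-sensors/gz-rendering already provides.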