Replies: 2 comments
- Maybe all data processing can happen outside the front-end? A single source can stitch together several sources and stream them to a single frontend instance. There is: I use(d) both for live visualizations.
- Hi @nikodul, regarding multi-MCAP: it is already implemented, so you can check out our releases.
Hi everyone,
We (Intempora, editors of the RTMaps and IVS software tools) need to be able to open several MCAP files simultaneously in a single Lichtblick instance, with the data from those files fed into the dashboard in a synchronized manner, as if it were a single file.
We could call this feature the "multi-source" capability.
The use case is as follows: consider a recording containing raw sensor data (and, in practice, terabytes or even petabytes of such recordings).
This recording later gets enriched with additional streams (e.g. ground-truth streams with 3D objects, segmented images, lane markings..., or just situational tags). When generated, these additional streams would be stored in a separate MCAP file.
For now we have to merge the streams into a single MCAP file, which introduces a lot of overhead in the post-processing workflow (particularly when some of these additional streams, e.g. ground-truth streams, are only available for a short sequence of the initial recording).
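To make the merge concrete, below is a minimal sketch of how messages from several already-sorted sources could be interleaved into one time-ordered stream. Everything here is illustrative: `TimestampedMessage` and `mergeSources` are hypothetical names, not part of the Lichtblick or MCAP APIs.

```typescript
// Hypothetical sketch: k sources, each already sorted by log time,
// merged into one globally time-ordered stream.
type TimestampedMessage = {
  logTime: bigint; // nanoseconds since epoch
  topic: string;
  data: Uint8Array;
};

async function* mergeSources(
  sources: AsyncIterator<TimestampedMessage>[],
): AsyncGenerator<TimestampedMessage> {
  // Current head message of each source; undefined once a source is exhausted.
  const heads: (TimestampedMessage | undefined)[] = await Promise.all(
    sources.map(async (s) => {
      const r = await s.next();
      return r.done ? undefined : r.value;
    }),
  );

  for (;;) {
    // Pick the source whose pending message has the smallest log time.
    let minIdx = -1;
    for (let i = 0; i < heads.length; i++) {
      const h = heads[i];
      if (h !== undefined && (minIdx === -1 || h.logTime < heads[minIdx]!.logTime)) {
        minIdx = i;
      }
    }
    if (minIdx === -1) return; // every source is exhausted
    yield heads[minIdx]!;
    // Advance only the source we just consumed.
    const r = await sources[minIdx].next();
    heads[minIdx] = r.done ? undefined : r.value;
  }
}
```

With a merge like this, the ground-truth file only needs to cover the short sequence it annotates; its messages simply interleave with the raw recording wherever their timestamps fall.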
A next step in the multi-source approach would be the ability to stream data via websockets while playing back an MCAP file.
Use case details:
- A sensor data recording is stored in a format that is not MCAP (e.g. MDF4, RTMaps .rec, .dat, rosbag, ...).
- We want to display the data in Lichtblick.
- The signal streams (e.g. vehicle speed, accelerations, GPS trace, etc.) are lightweight and can easily be exported to MCAP to constitute an "IterableSource" (where the full plot or full trajectory is loaded at file opening, for displaying the data throughout the full recording).
- The heavier streams (typically video and point clouds) may not need to be exported to the MCAP file and could be streamed directly from some software that reads the original format and streams the data to Lichtblick. That would also save significant storage and post-processing resources.

This would additionally require the possibility to emit time control commands (seeking, pausing, fast-forwarding, etc.) to that software, via topics for instance, so that the streaming source software has a chance to remain synced with the Lichtblick dashboard and feed the right data at the right time.
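As an illustration of what such time control could look like, here is a sketch of a small JSON command protocol over a WebSocket, seen from the external streaming tool's side. To be clear, the message shapes and the endpoint are assumptions made up for this example; Lichtblick defines no such protocol today.

```typescript
// Hypothetical time-control protocol: the message shapes and endpoint below
// are invented for illustration, not an existing Lichtblick API.
type TimeControlCommand =
  | { op: "play"; speed: number }    // 1.0 = real time, 2.0 = fast forward, ...
  | { op: "pause" }
  | { op: "seek"; logTime: string }; // target log time in nanoseconds, as a string

// The external tool listens for commands from the dashboard side and keeps
// its reader of the original recording (MDF4, .rec, rosbag, ...) in sync.
const ws = new WebSocket("ws://localhost:8765/time-control"); // assumed endpoint

ws.onmessage = (event) => {
  const cmd = JSON.parse(event.data as string) as TimeControlCommand;
  switch (cmd.op) {
    case "play":
      // Resume streaming from the current position at cmd.speed.
      break;
    case "pause":
      // Stop emitting messages until the next "play" or "seek".
      break;
    case "seek":
      // Reposition the reader to cmd.logTime before resuming, so the heavy
      // streams (video, point clouds) line up with what the dashboard shows.
      break;
  }
};
```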
Summary of the complete feature request:
- Open several MCAP files simultaneously in a single Lichtblick instance, synchronized as if they were one file (the "multi-source" capability).
- Stream data via websockets alongside MCAP file playback.
- Emit time control commands via topics so that external streaming sources can stay synced with the dashboard.

Do these features ring a bell for anyone?
Could we pool our efforts on this in the medium term? It may not be the most urgent feature on our table, but it will probably require some preparation, exchanges, and perhaps training workshops on the internal architecture of the tool.
Best regards,