Description
Hi, I am wondering how a trajectory can be projected onto the front camera view. I know a similar visualization is possible in the BEV case with the visualization tool you created (`add_trajectory_bev`).
I tried to do it using the camera's extrinsic and intrinsic parameters, but without success.
I managed to work out how the sensor-to-lidar (camera-to-lidar) extrinsics can be extracted, but I am unable to obtain the local-to-camera (or global-to-camera) extrinsics, since the trajectory is output in the local ego coordinate frame.
In other words, I want to transform the local ego trajectory into the front camera frame and then project it onto the image plane using the intrinsics.
Additional question:
Is there any way to selectively access a certain type of scenario or scene, such as challenging scenarios, to evaluate how well the model navigates them?
Your advice would be very helpful!