PerReg+: Towards Generalizable Trajectory Prediction using Dual-Level Representation Learning and Adaptive Prompting
Official codebase for the CVPR 2025 paper:
"Towards Generalizable Trajectory Prediction using Dual-Level Representation Learning and Adaptive Prompting"
by Kaouther Messaoud, Matthieu Cord, and Alexandre Alahi.
PerReg+ is a novel transformer-based framework for vehicle trajectory prediction. It addresses key challenges in autonomous driving by:
- Generalizing across domains and datasets
- Enabling multimodal prediction without clustering or NMS
- Providing efficient domain adaptation via adaptive prompt tuning
Its main components are:
- Dual-Level Representation Learning using Self-Distillation (SD) and Masked Reconstruction (MR)
- Register-based Queries for efficient and structured multimodal output (see the decoder sketch after this list)
- Segment-Level Reconstruction of trajectories and lanes
- Adaptive Prompt Tuning for scalable and fast fine-tuning
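To make the register-query idea concrete, here is a minimal, hypothetical PyTorch sketch (names such as `RegisterQueryDecoder`, `num_modes`, and `horizon` are illustrative and not taken from this codebase): a fixed set of learnable register queries cross-attends to the encoded scene, and each query is decoded into one trajectory mode plus a confidence score, so no clustering or NMS is needed to obtain distinct modes.

```python
# Hypothetical sketch, not the authors' implementation.
import torch
import torch.nn as nn

class RegisterQueryDecoder(nn.Module):
    def __init__(self, d_model=128, num_modes=6, horizon=60, num_layers=2):
        super().__init__()
        # One learnable query per output mode ("register").
        self.register_queries = nn.Parameter(torch.randn(num_modes, d_model))
        layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
        self.decoder = nn.TransformerDecoder(layer, num_layers=num_layers)
        self.traj_head = nn.Linear(d_model, horizon * 2)   # (x, y) per future step
        self.score_head = nn.Linear(d_model, 1)            # per-mode confidence
        self.horizon = horizon

    def forward(self, scene_tokens):
        # scene_tokens: (B, N, d_model) encoded agent/map tokens
        B = scene_tokens.size(0)
        queries = self.register_queries.unsqueeze(0).expand(B, -1, -1)
        modes = self.decoder(queries, scene_tokens)                  # (B, K, d_model)
        trajs = self.traj_head(modes).view(B, -1, self.horizon, 2)   # (B, K, T, 2)
        scores = self.score_head(modes).squeeze(-1).softmax(dim=-1)  # (B, K)
        return trajs, scores

# Usage: trajs, scores = RegisterQueryDecoder()(torch.randn(4, 32, 128))
```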
PerReg+ achieves state-of-the-art results on the UniTraj Benchmark and shows strong cross-domain generalization.
- CVPR 2025 paper (PDF): link will be added when available
- 6.8% reduction in Brier-minFDE on small datasets through SSL pretraining
- 11.8% improvement in cross-domain generalization performance
- Prompt-based fine-tuning: adapts to new domains with minimal overhead (a sketch follows this list)
- No clustering or NMS required for multimodal prediction
- Richer scene understanding through segment-level reconstruction and dual-level supervision
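As a rough illustration of prompt-based fine-tuning (the wrapper below is an assumption, not the paper's exact scheme): the pretrained backbone is frozen and only a small set of learnable prompt tokens, prepended to the scene tokens, is optimized on the target domain.

```python
# Hypothetical sketch; names and sizes are illustrative.
import torch
import torch.nn as nn

class PromptTunedEncoder(nn.Module):
    def __init__(self, pretrained_encoder, d_model=128, num_prompts=8):
        super().__init__()
        self.encoder = pretrained_encoder
        for p in self.encoder.parameters():   # freeze the pretrained backbone
            p.requires_grad = False
        # Domain-specific prompts are the only new trainable parameters.
        self.prompts = nn.Parameter(torch.zeros(num_prompts, d_model))
        nn.init.normal_(self.prompts, std=0.02)

    def forward(self, scene_tokens):
        # scene_tokens: (B, N, d_model)
        B = scene_tokens.size(0)
        prompts = self.prompts.unsqueeze(0).expand(B, -1, -1)
        return self.encoder(torch.cat([prompts, scene_tokens], dim=1))

# Only the prompt parameters are handed to the optimizer:
# optim = torch.optim.AdamW([model.prompts], lr=1e-4)
```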
Evaluated on the UniTraj benchmark. Achievements:
- Outperforms AutoBot, MTR, and Forecast-MAE baselines
- Best Brier-minFDE (B-FDE) and minFDE across single-dataset and multi-dataset training (see the metric sketch below)
- Strong generalization to unseen domains
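For reference, the reported metrics follow the standard motion-forecasting definitions; a minimal sketch (shapes and names assumed, independent of this codebase): minFDE is the endpoint error of the best of K modes, and Brier-minFDE adds (1 - p)^2 where p is the probability assigned to that best mode.

```python
# Reference computation with assumed shapes; not tied to this repository.
import numpy as np

def brier_min_fde(pred_trajs, probs, gt_traj):
    """pred_trajs: (K, T, 2), probs: (K,), gt_traj: (T, 2)."""
    end_errors = np.linalg.norm(pred_trajs[:, -1] - gt_traj[-1], axis=-1)  # (K,)
    best = int(np.argmin(end_errors))       # mode with the closest endpoint
    min_fde = end_errors[best]
    return min_fde, min_fde + (1.0 - probs[best]) ** 2

# Example: brier_min_fde(np.zeros((6, 60, 2)), np.full(6, 1 / 6), np.zeros((60, 2)))
```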
This repository provides:
- Full code release (training & evaluation)
- Pretrained model checkpoints
- Detailed instructions and tutorials