vita-epfl/PerReg


PerReg+: Towards Generalizable Trajectory Prediction using Dual-Level Representation Learning and Adaptive Prompting

Official codebase for the CVPR 2025 paper:
"Towards Generalizable Trajectory Prediction using Dual-Level Representation Learning and Adaptive Prompting"
by Kaouther Messaoud, Matthieu Cord, and Alexandre Alahi.



Overview

PerReg+ is a novel transformer-based framework for vehicle trajectory prediction. It addresses key challenges in autonomous driving by:

  • Generalizing across domains and datasets
  • Enabling multimodal prediction without clustering or NMS
  • Providing efficient domain adaptation via adaptive prompt tuning

Key Innovations:

  • Dual-Level Representation Learning using Self-Distillation (SD) and Masked Reconstruction (MR)
  • Register-based Queries for efficient and structured multimodal output
  • Segment-Level Reconstruction of trajectories and lanes
  • Adaptive Prompt Tuning for scalable and fast fine-tuning

PerReg+ achieves state-of-the-art results on the UniTraj Benchmark and shows strong cross-domain generalization.
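To make the register-based-query idea concrete, here is a minimal sketch of how a set of learned queries can decode multiple trajectory modes directly, so no clustering or NMS step is needed. All names, dimensions, and layer choices below are illustrative assumptions, not the released PerReg+ architecture:

```python
import torch
import torch.nn as nn

class RegisterQueryDecoder(nn.Module):
    """Illustrative sketch: K learned queries cross-attend to the encoded
    scene and each query decodes one trajectory mode plus a confidence,
    avoiding clustering/NMS. Shapes and heads are hypothetical."""
    def __init__(self, d_model=128, num_modes=6, horizon=30):
        super().__init__()
        # One learned query per output mode (register-style queries)
        self.queries = nn.Parameter(torch.randn(num_modes, d_model))
        self.attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.traj_head = nn.Linear(d_model, horizon * 2)   # (x, y) per future step
        self.score_head = nn.Linear(d_model, 1)            # per-mode confidence
        self.horizon = horizon

    def forward(self, scene_tokens):
        # scene_tokens: (B, N, d_model) encoded agent/lane tokens
        B = scene_tokens.size(0)
        q = self.queries.unsqueeze(0).expand(B, -1, -1)    # (B, K, d_model)
        out, _ = self.attn(q, scene_tokens, scene_tokens)  # cross-attention
        trajs = self.traj_head(out).view(B, -1, self.horizon, 2)
        scores = self.score_head(out).squeeze(-1).softmax(dim=-1)
        return trajs, scores

dec = RegisterQueryDecoder()
trajs, scores = dec(torch.randn(2, 50, 128))
print(trajs.shape, scores.shape)  # torch.Size([2, 6, 30, 2]) torch.Size([2, 6])
```

Because each query is a distinct learned parameter, the modes specialize during training and the decoder emits a fixed-size, structured multimodal output in a single forward pass.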


Paper


Highlights

  • πŸ“‰ 6.8% reduction in Brier-minFDE on small datasets through SSL pretraining
  • πŸ”„ 11.8% improvement in cross-domain generalization performance
  • ⚑ Prompt-based fine-tuning: adapts with minimal overhead
  • πŸ›  No clustering/NMS required for multimodal prediction
  • 🧠 Richer scene understanding through segment-level reconstruction and dual-level supervision
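The prompt-based fine-tuning highlighted above can be sketched as follows: the pretrained backbone is frozen and only a small set of learned prompt tokens, prepended to the encoder input, is trained for the new domain. This is a generic prompt-tuning sketch under assumed shapes, not the released PerReg+ code:

```python
import torch
import torch.nn as nn

class PromptTunedEncoder(nn.Module):
    """Sketch of prompt-based adaptation: freeze the pretrained encoder and
    train only a few prompt embeddings per target domain. All names and
    dimensions are hypothetical."""
    def __init__(self, encoder, d_model=128, num_prompts=8):
        super().__init__()
        self.encoder = encoder
        for p in self.encoder.parameters():
            p.requires_grad = False            # backbone stays frozen
        self.prompts = nn.Parameter(torch.zeros(num_prompts, d_model))

    def forward(self, tokens):
        # tokens: (B, N, d_model); prepend domain-specific prompt tokens
        B = tokens.size(0)
        prm = self.prompts.unsqueeze(0).expand(B, -1, -1)
        return self.encoder(torch.cat([prm, tokens], dim=1))

layer = nn.TransformerEncoderLayer(d_model=128, nhead=4, batch_first=True)
backbone = nn.TransformerEncoder(layer, num_layers=2)
model = PromptTunedEncoder(backbone)
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(trainable)  # only the 8 x 128 prompt parameters = 1024
```

Since only the prompt parameters receive gradients, adapting to a new dataset touches a tiny fraction of the model's weights, which is what keeps the fine-tuning overhead minimal.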

Benchmarks

Evaluated on the UniTraj benchmark.

Achievements:

  • Outperforms AutoBot, MTR, and Forecast-MAE baselines
  • Best Brier-minFDE and minFDE across single-dataset and multi-dataset training
  • Strong generalization to unseen domains

Coming Soon

  • πŸ“¦ Full code release (training & evaluation)
  • πŸ† Pretrained model checkpoints
  • πŸ“š Detailed instructions and tutorials
