Description
Motivation
`D4RLExperienceReplay` is currently used in torchrl to load offline datasets for algorithms like CQL. However, the maintainers of D4RL have officially deprecated this dataset source, stating:

> "All offline datasets in D4RL have been moved to Minari, and all online environments have been transferred to Gymnasium, MiniGrid, and Gymnasium-Robotics."

To stay aligned with this change and maintain long-term compatibility, torchrl should migrate from `D4RLExperienceReplay` to `MinariExperienceReplay`.
Solution
Update offline RL training scripts to replace `D4RLExperienceReplay` with `MinariExperienceReplay`. This ensures the project continues to work with actively maintained offline RL datasets and avoids future breakage.
Alternatives
- Continue using `D4RLExperienceReplay` temporarily, but this will eventually lead to broken compatibility as D4RL is no longer maintained.
- Implement a transitional wrapper that abstracts over both replay sources (less ideal given the clear deprecation).
Additional context
This request is backed by the official deprecation notice in the Farama Foundation's D4RL GitHub repository and their published migration plan.
The initial work has already been implemented for CQL in a related PR. Migration of the other offline RL agents can follow in subsequent steps.
Checklist
- I have checked that there is no similar issue in the repo (required)