v0.3.0
What's Changed
- Force float32 when loading transformers configs by @dakinggg in #11 (a sketch follows this list)
- Torch 2.6 Version Bump by @abaheti95 in #13
- Preference RL refactor by @abaheti95 in #12
- Standardized the `sequence_id` batch variable to match llm-foundry by @abaheti95 in #14 (illustrated after this list)
- Standardized the attention mask field in DPO, RM, and fine-grained preferences by @abaheti95 in #15
- Updating sequence length usage by @bcui-db in #17
- Separate inference engine by @bcui-db in #16
- Upper bound vllm by @dakinggg in #19
- Update setuptools version by @irenedea in #22
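
For #11, a minimal sketch of what forcing float32 on a loaded config can look like; the placeholder model name and the use of `AutoConfig` are assumptions, not necessarily the PR's actual code path:

```python
import torch
from transformers import AutoConfig

# Load the model config, then force float32 regardless of the dtype
# recorded in the checkpoint. "my-org/my-model" is a placeholder, and
# going through AutoConfig is an assumption about how #11 does it.
config = AutoConfig.from_pretrained("my-org/my-model")
config.torch_dtype = torch.float32
```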
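
For #14, llm-foundry uses a `sequence_id` tensor to mark which packed sequence each token belongs to; a hedged illustration of a batch carrying that field (the other keys, values, and shapes are assumptions):

```python
import torch

# Two sequences packed into one row; sequence_id labels each token with
# the index of the sequence it came from, matching the llm-foundry
# field name. The other keys and values here are purely illustrative.
batch = {
    "input_ids": torch.tensor([[11, 12, 13, 21, 22, 23, 24]]),
    "attention_mask": torch.ones(1, 7, dtype=torch.long),
    "sequence_id": torch.tensor([[0, 0, 0, 1, 1, 1, 1]]),
}
```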
New Contributors
- @dakinggg made their first contribution in #11
- @abaheti95 made their first contribution in #13
- @irenedea made their first contribution in #22
Full Changelog: v0.2.1...v0.3.0