This is the implementation for [Full Parameter Fine-Tuning for Large Language Models with Limited Resources](https://arxiv.org/pdf/2306.09782.pdf)
and [AdaLomo: Low-memory Optimization with Adaptive Learning Rate](https://arxiv.org/pdf/2310.10195.pdf).
# News
- LOMO and AdaLomo were integrated into [`transformers`](https://huggingface.co/docs/transformers/main/en/trainer#lomo-optimizer) and [`accelerate`](https://huggingface.co/docs/accelerate/main/en/package_reference/accelerator#accelerate.Accelerator.lomo_backward); a usage sketch follows this list.
- PyPI package `lomo-optim` was released.
- LOMO and AdaLomo were integrated into the [`CoLLiE`](https://github.com/OpenMOSS/collie) library, which supports Collaborative Training of Large Language Models in an Efficient Way.
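
For the `transformers` route, the optimizer is selected through `TrainingArguments.optim` (the `accelerate` route goes through `Accelerator.lomo_backward`, linked above). The sketch below is a hedged illustration, not part of this project: it assumes a recent `transformers` release with LOMO support and `lomo-optim` installed, and the tiny checkpoint `sshleifer/tiny-gpt2` plus the toy in-memory dataset are stand-ins.

```python
# Minimal sketch of the Hugging Face Trainer path. Assumed: a recent
# `transformers` release with LOMO support and `lomo-optim` installed.
# The tiny checkpoint and toy dataset are illustrative placeholders.
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments)

tok = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")
tok.pad_token = tok.eos_token  # GPT-2 tokenizers have no pad token by default
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")

# A few tokenized sentences; labels mirror input_ids for the causal-LM loss.
enc = tok(["hello world", "low-memory optimization", "adaptive learning rate"],
          padding="max_length", truncation=True, max_length=16)
enc["labels"] = [ids.copy() for ids in enc["input_ids"]]
train_ds = Dataset.from_dict(dict(enc))

args = TrainingArguments(
    output_dir="out",
    optim="adalomo",          # "lomo" selects the plain LOMO optimizer
    learning_rate=1e-3,
    per_device_train_batch_size=2,
    num_train_epochs=1,
    report_to=[],
)
Trainer(model=model, args=args, train_dataset=train_ds).train()
```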
# Usage
You can install `lomo-optim` from PyPI using pip.
```bash
pip install lomo-optim
```
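
Then import `Lomo` or `AdaLomo` from `lomo_optim`. The sketch below is an assumption based on this repository's API rather than a definitive recipe: LOMO fuses gradient computation with the parameter update, so a `fused_backward(loss, lr)` call stands in for the usual `loss.backward()` followed by `optimizer.step()`.

```python
# A minimal sketch, assuming the `Lomo` class exported by `lomo-optim`
# and its fused-backward API; `AdaLomo` is exported the same way.
import torch
from lomo_optim import Lomo

model = torch.nn.Linear(8, 2)
optimizer = Lomo(model, lr=1e-3)  # LOMO wraps the whole model, not a param list

x = torch.randn(4, 8)
loss = model(x).pow(2).mean()

# Gradient computation and the in-place parameter update are fused into
# one backward pass, so no separate optimizer.step() call is needed.
optimizer.fused_backward(loss, 1e-3)
```

Folding the update into the backward pass is what removes the need to keep full gradients in memory; for gradient clipping, the implementation computes the gradient norm in an extra pass (`optimizer.grad_norm(loss)`) before `fused_backward`, so check the repository's examples for the exact call sequence.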

The code for AdaLomo is in the [adalomo](adalomo) folder.

# Citation

```bibtex
@article{lv2023full,
  title={Full Parameter Fine-Tuning for Large Language Models with Limited Resources},
  author={Lv, Kai and Yang, Yuqing and Liu, Tengxiao and Gao, Qinghui and Guo, Qipeng and Qiu, Xipeng},
  journal={arXiv preprint arXiv:2306.09782},
  year={2023}
}

@article{lv2023adalomo,
  title={AdaLomo: Low-memory Optimization with Adaptive Learning Rate},
  author={Lv, Kai and Yan, Hang and Guo, Qipeng and Lv, Haijun and Qiu, Xipeng},
  journal={arXiv preprint arXiv:2310.10195},
  year={2023}
}
```