This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Commit 4f7ba12

Transformers doc fixes (#1578)
1 parent 494cd72 · commit 4f7ba12

15 files changed: +20, -22 lines changed

integrations/huggingface-transformers/README.md

Lines changed: 6 additions & 8 deletions
@@ -29,11 +29,9 @@ Once trained, SparseML enables you to export models to the ONNX format, such tha
 Install with `pip`:
 
 ```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
 ```
 
-**Note**: Transformers will not immediately install with this command. Instead, a sparsification-compatible version of Transformers will install on the first invocation of the Transformers code in SparseML.
-
 ## **Tutorials**
 
 - [Sparse Transfer Learning with the Python API](tutorials/sparse-transfer-learning-bert-python.md) [**RECOMMENDED**]
@@ -50,13 +48,13 @@ pip install sparseml[torch]
 ### **Use Case Examples - Python**
 
 - [Sparse Transfer with GLUE Datasets (SST2) for sentiment analysis](tutorials/sentiment-analysis/docs-sentiment-analysis-python-sst2.ipynb)
-- [Sparse Transfer with Custom Datasets (RottenTomatoes) and Custom Teacher from HF Hub for sentiment analysis](tutorials/sentiment-analysis/docs-sentiment-analysis-python-custom-teacher-rottentomatoes)
+- [Sparse Transfer with Custom Datasets (RottenTomatoes) and Custom Teacher from HF Hub for sentiment analysis](tutorials/sentiment-analysis/docs-sentiment-analysis-python-custom-teacher-rottentomatoes.ipynb)
 - [Sparse Transfer with GLUE Datasets (QQP) for multi-input text classification](tutorials/text-classification/docs-text-classification-python-qqp.ipynb)
 - [Sparse Transfer with Custom Datasets (SICK) for multi-input text classification](tutorials/text-classification/docs-text-classification-python-sick.ipynb)
 - [Sparse Transfer with Custom Datasets (TweetEval) and Custom Teacher for single input text classification](tutorials/text-classification/docs-text-classification-python-custom-teacher-tweeteval.ipynb)
 - [Sparse Transfer with Custom Datasets (GoEmotions) for multi-label text classification](tutorials/text-classification/docs-text-classification-python-multi-label-go_emotions.ipynb)
 - [Sparse Transfer with Conll2003 for named entity recognition](tutorials/token-classification/docs-token-classification-python-conll2003.ipynb)
-- [Sparse Transfer with Custom Datasets (WNUT) and Custom Teacher for named entity recognition](tutorials/token-classification/docs-token-classification-custom-teacher-wnut.ipynb)
+- [Sparse Transfer with Custom Datasets (WNUT) and Custom Teacher for named entity recognition](tutorials/token-classification/docs-token-classification-python-custom-teacher-wnut.ipynb)
 - Sparse Transfer with SQuAD (example coming soon!)
 - Sparse Transfer with Squadshifts Amazon (example coming soon!)
 
@@ -66,7 +64,7 @@ pip install sparseml[torch]
 
 SparseZoo is an open-source repository of pre-sparsified models, including BERT-base, BERT-large, RoBERTa-base, RoBERTa-large, and DistilBERT. With SparseML, you can fine-tune these pre-sparsified checkpoints onto custom datasets (while maintaining sparsity) via sparse transfer learning. This makes training inference-optimized sparse models almost identical to your typical training workflows!
 
-[Check out the available models](https://sparsezoo.neuralmagic.com/?repo=huggingface&page=1)
+[Check out the available models](https://sparsezoo.neuralmagic.com/?repos=huggingface)
 
 ### **Recipes**
 
@@ -140,15 +138,15 @@ Currently supported tasks include:
 
 Sparse Transfer is very similar to the typical transfer learning process used to train NLP models, where we fine-tune a checkpoint pretrained on a large upstream dataset using masked language modeling onto a smaller downstream dataset. With Sparse Transfer Learning, however, we simply start the fine-tuning process from a pre-sparsified checkpoint and maintain sparsity while the training process occurs.
 
-Here, we will fine-tune a [90% pruned version of BERT](https://sparsezoo.neuralmagic.com/models/nlp%2Fmasked_language_modeling%2Fobert-base%2Fpytorch%2Fhuggingface%2Fwikipedia_bookcorpus%2Fpruned90-none) from the SparseZoo onto SST2.
+Here, we will fine-tune a [90% pruned version of BERT](https://sparsezoo.neuralmagic.com/models/obert-base-wikipedia_bookcorpus-pruned90?comparison=obert-base-wikipedia_bookcorpus-base) from the SparseZoo onto SST2.
 
 ### **Kick off Training**
 
 We will use SparseML's `sparseml.transformers.text_classification` training script.
 
 To run sparse transfer learning, we first need to create/select a sparsification recipe. For sparse transfer, we need a recipe that instructs SparseML to maintain sparsity during training and to quantize the model.
 
-For the SST2 dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/nlp%2Fsentiment_analysis%2Fobert-base%2Fpytorch%2Fhuggingface%2Fsst2%2Fpruned90_quant-none), identified by the following SparseZoo stub:
+For the SST2 dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/obert-base-sst2_wikipedia_bookcorpus-pruned90_quantized?comparison=obert-base-sst2_wikipedia_bookcorpus-base&tab=0), identified by the following SparseZoo stub:
 ```
 zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none
 ```
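For context on the final hunk above: the recipe stub is what the README passes to SparseML's `sparseml.transformers.text_classification` script. Below is a rough, non-authoritative sketch of how the pieces fit together. The model stub is inferred from the old SparseZoo URL in the same hunk, the flag names are assumptions based on the Hugging Face `run_glue.py` conventions these scripts mirror, and the output directory is a made-up placeholder; check the script's `--help` output for the exact interface.

```bash
# Hedged sketch (not part of this commit): launching sparse transfer onto SST2.
# Model/recipe stubs are taken or inferred from the README hunks above; flag names
# are assumptions based on Hugging Face Trainer-style CLIs and may differ in SparseML.
sparseml.transformers.text_classification \
  --task_name sst2 \
  --model_name_or_path "zoo:nlp/masked_language_modeling/obert-base/pytorch/huggingface/wikipedia_bookcorpus/pruned90-none" \
  --recipe "zoo:nlp/sentiment_analysis/obert-base/pytorch/huggingface/sst2/pruned90_quant-none" \
  --do_train \
  --do_eval \
  --output_dir sparse_bert_sst2  # hypothetical output path
```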

integrations/huggingface-transformers/tutorials/question-answering/question-answering-cli.md

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@ In this tutorial, you will learn how to:
 Install SparseML via `pip`:
 
 ```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
 ```
 
 ## **Sparse Transfer Learning onto SQuAD**

integrations/huggingface-transformers/tutorials/sentiment-analysis/docs-sentiment-analysis-python-custom-teacher-rottentomatoes.ipynb

Lines changed: 1 addition & 1 deletion
@@ -33,7 +33,7 @@
 },
 "outputs": [],
 "source": [
-"!pip install sparseml[torch]"
+"!pip install sparseml[transformers]"
 ]
 },
 {

integrations/huggingface-transformers/tutorials/sentiment-analysis/docs-sentiment-analysis-python-sst2.ipynb

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@
 },
 "outputs": [],
 "source": [
-"!pip install sparseml[torch]"
+"!pip install sparseml[transformers]"
 ]
 },
 {

integrations/huggingface-transformers/tutorials/sentiment-analysis/sentiment-analysis-cli.md

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@ In this tutorial, you will learn how to:
 Install SparseML via `pip`:
 
 ```bash
-pip install sparseml[torch]
+pip install sparseml[transformers]
 ```
 
 ## Sparse Transfer Learning onto SST2 (GLUE Task)

integrations/huggingface-transformers/tutorials/sparse-transfer-learning-bert-python.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ SparseZoo contains pre-sparsified checkpoints of common NLP models like BERT and
 Install via `pip`:
 
 ```
-pip install sparseml[torch]
+pip install sparseml[transformers]
 ```
 
 ## **Sparse Transfer Learning onto SST2**

integrations/huggingface-transformers/tutorials/sparse-transfer-learning-bert.md

Lines changed: 1 addition & 1 deletion
@@ -31,7 +31,7 @@ SparseZoo contains pre-sparsified checkpoints of common NLP models like BERT and
 Install via `pip`:
 
 ```
-pip install sparseml[torch]
+pip install sparseml[transformers]
 ```
 
 ## **Example: Sparse Transfer Learning onto SST2**

integrations/huggingface-transformers/tutorials/text-classification/docs-text-classification-python-custom-teacher-tweeteval.ipynb

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@
 },
 "outputs": [],
 "source": [
-"!pip install sparseml[torch]"
+"!pip install sparseml[transformers]"
 ]
 },
 {

integrations/huggingface-transformers/tutorials/text-classification/docs-text-classification-python-multi-label-go_emotions.ipynb

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@
 },
 "outputs": [],
 "source": [
-"!pip install sparseml[torch]"
+"!pip install sparseml[transformers]"
 ]
 },
 {

integrations/huggingface-transformers/tutorials/text-classification/docs-text-classification-python-qqp.ipynb

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@
 },
 "outputs": [],
 "source": [
-"!pip install sparseml[torch]"
+"!pip install sparseml[transformers]"
 ]
 },
 {
