
Commit 9752f6a

YOLOv5 doc fixes (#1574)
1 parent 544e5b5 commit 9752f6a

File tree

1 file changed, +3 -5 lines changed

integrations/ultralytics-yolov5/README.md

Lines changed: 3 additions & 5 deletions
@@ -29,11 +29,9 @@ Once trained, SparseML enables you to export models to the ONNX format, such tha
 Install with `pip`:
 
 ```bash
-pip install sparseml[torchvision]
+pip install sparseml[yolov5]
 ```
 
-**Note**: YOLOv5 will not immediately install with this command. Instead, a sparsification-compatible version of YOLOv5 will install on the first invocation of the YOLOv5 code in SparseML.
-
 ## Tutorials
 
 - [Sparse Transfer Learning with the CLI](tutorials/sparse-transfer-learning.md) **[HIGHLY RECOMMENDED]**
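As a quick check of the renamed extra (a sketch added for context, not part of the commit, and assuming the `sparseml.yolov5.train` entry point referenced later in this README is installed with the package):

```bash
# Install the YOLOv5 extra exactly as the updated README instructs
pip install "sparseml[yolov5]"

# Assumed check: printing the help text of the sparseml.yolov5.train CLI
# is a lightweight way to confirm the YOLOv5 integration resolved correctly
sparseml.yolov5.train --help
```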
@@ -91,15 +89,15 @@ SparseML inherits most arguments from the Ultralytics repository. [Check out the
 
 Sparse Transfer is very similar to the typical transfer learning process used to train YOLOv5 models, where we fine-tune a checkpoint pretrained on COCO onto a smaller downstream dataset. With Sparse Transfer Learning, however, we simply start the fine-tuning process from a pre-sparsified checkpoint and maintain sparsity while the training process occurs.
 
-Here, we will fine-tune a [75% pruned-quantized version of YOLOv5s](https://sparsezoo.neuralmagic.com/models/cv%2Fdetection%2Fyolov5-s%2Fpytorch%2Fultralytics%2Fcoco%2Fpruned75_quant-none) onto VOC.
+Here, we will fine-tune a [75% pruned-quantized version of YOLOv5s](https://sparsezoo.neuralmagic.com/models/yolov5-s-coco-pruned75_quantized?comparison=yolov5-s-coco-base&tab=0) onto VOC.
 
 ### Kick off Training
 
 We will use SparseML's `sparseml.yolov5.train` training script.
 
 To run sparse transfer learning, we first need to create/select a sparsification recipe. For sparse transfer, we need a recipe that instructs SparseML to maintain sparsity during training and to quantize the model over the final epochs.
 
-For the VOC dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/cv%2Fdetection%2Fyolov5-s%2Fpytorch%2Fultralytics%2Fcoco%2Fpruned75_quant-none), identified by the following SparseZoo stub:
+For the VOC dataset, there is a [transfer learning recipe available in SparseZoo](https://sparsezoo.neuralmagic.com/models/yolov5-s-coco-pruned75_quantized?comparison=yolov5-s-coco-base&tab=0), found under the recipes tab and identified by the following SparseZoo stub:
 ```bash
 zoo:cv/detection/yolov5-s/pytorch/ultralytics/coco/pruned75_quant-none?recipe_type=transfer_learn
 ```
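To show how such a stub is typically consumed, here is a hedged sketch (not part of the commit): `--weights` and `--data` are standard Ultralytics arguments that the README says SparseML inherits, while `--recipe` is assumed here to be the SparseML-specific flag for supplying the transfer-learning recipe.

```bash
# Sketch: sparse transfer learning onto VOC using the SparseZoo stub above.
# --weights / --data are inherited Ultralytics arguments; --recipe is the
# assumed SparseML flag that points at the maintained-sparsity transfer recipe.
sparseml.yolov5.train \
  --weights zoo:cv/detection/yolov5-s/pytorch/ultralytics/coco/pruned75_quant-none?recipe_type=transfer_learn \
  --recipe zoo:cv/detection/yolov5-s/pytorch/ultralytics/coco/pruned75_quant-none?recipe_type=transfer_learn \
  --data VOC.yaml
```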
