
Commit 1a131c6: Update README.md
1 parent de8c605


README.md

Lines changed: 6 additions & 4 deletions
@@ -5,20 +5,22 @@ Currently, only includes 0.1.62 and 0.1.66.
 
 Requirements:
 - Windows and Linux x86_64
+- CPU with support for AVX or AVX2
 - CUDA 11.6 - 12.1
 - CPython 3.7 - 3.11
 
 Installation instructions:
 ---
 To install, you can use this command:
 ```
-python -m pip install llama-cpp-python==0.1.66+cu117 --find-links=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/cu117
+python -m pip install llama-cpp-python==0.1.66+cu117 --extra-index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX2/cu117
 ```
-This will install llama-cpp-python 0.1.66 for CUDA 11.7. You can change both instances of `cu117` to change the CUDA version.
+This will install llama-cpp-python 0.1.66 for CUDA 11.7. You can change both instances of `cu117` to change the CUDA version.
+You can also change `AVX2` to `AVX` if needed for your CPU.
 
-An example for installing 0.1.62 for CUDA 12.1:
+An example for installing 0.1.62 for CUDA 12.1 on a CPU without AVX2 support:
 ```
-python -m pip install llama-cpp-python==0.1.62+cu121 --find-links=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/cu121
+python -m pip install llama-cpp-python==0.1.62+cu121 --extra-index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu121
 ```
 ---
 ### All wheels are compiled using GitHub Actions
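
After installing one of these wheels, a quick way to confirm that the cuBLAS build is active is to load a model with GPU offload enabled. The following is a minimal sketch, not part of the README: the model path is a placeholder for any local GGML model compatible with llama-cpp-python 0.1.6x, and `n_gpu_layers` is set only to trigger GPU offload.

```python
# Minimal sketch (assumes llama-cpp-python 0.1.6x from one of these wheels and a
# local GGML model file; the path below is a placeholder, not a bundled file).
from llama_cpp import Llama

# n_gpu_layers > 0 asks llama.cpp to offload that many layers to the GPU via cuBLAS.
llm = Llama(model_path="./models/your-model.ggmlv3.q4_0.bin", n_gpu_layers=32)

out = llm("Q: Name the planets in the solar system. A:", max_tokens=32)
print(out["choices"][0]["text"])
```

With a cuBLAS wheel installed, the llama.cpp startup log should report `BLAS = 1` and note the layers offloaded to the GPU; a CPU-only build reports `BLAS = 0`.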
