Currently, only includes 0.1.62 and 0.1.66.

Requirements:
- Windows and Linux x86_64
- CPU with support for AVX or AVX2
- CUDA 11.6 - 12.1
- CPython 3.7 - 3.11

Installation instructions:

---
To install, you can use this command:
```
python -m pip install llama-cpp-python==0.1.66+cu117 --extra-index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX2/cu117
```
This will install llama-cpp-python 0.1.66 for CUDA 11.7. You can change both instances of `cu117` to change the CUDA version.
You can also change `AVX2` to `AVX` if needed for your CPU.

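If you are unsure which build your CPU supports, a minimal sketch like the following (Linux only; it reads the CPU flags from `/proc/cpuinfo`) picks the matching index and prints the install command rather than guessing:

```shell
# Detect AVX2 support from the CPU flags (Linux only) and pick the matching wheel index.
# This only prints the command; review it before running it yourself.
if grep -q avx2 /proc/cpuinfo; then ISA=AVX2; else ISA=AVX; fi
echo "python -m pip install llama-cpp-python==0.1.66+cu117 --extra-index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/$ISA/cu117"
```

On Windows, check the AVX/AVX2 entries for your CPU model in a tool such as CPU-Z instead.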
An example for installing 0.1.62 for CUDA 12.1 on a CPU without AVX2 support:
```
python -m pip install llama-cpp-python==0.1.62+cu121 --extra-index-url=https://jllllll.github.io/llama-cpp-python-cuBLAS-wheels/AVX/cu121
```
---
### All wheels are compiled using GitHub Actions