-
Update: I noticed that when I check the PATH from the Developer Command Prompt for VS 2022, I get a totally different set of paths than when I check it from the VS Code terminal, so there is clearly a PATH discrepancy at work, all other things being equal. I haven't pinned down exactly what yet. Can verify: this fixed it.
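For anyone comparing the two environments the same way, here is a minimal sketch (plain Python, nothing PrivateGPT-specific; the output filenames mentioned below are just placeholders) that prints each PATH entry on its own line so the Developer Command Prompt and VS Code outputs can be diffed:

```python
# path_dump.py -- sketch: print each PATH entry on its own line so the output
# from two different shells can be saved to files and compared.
import os

for entry in os.environ.get("PATH", "").split(os.pathsep):
    if entry:
        print(entry)
```

Running `python path_dump.py > devprompt_path.txt` in the Developer Command Prompt for VS 2022 and `python path_dump.py > vscode_path.txt` in the VS Code terminal, then diffing the two files, should show exactly which build-tool directories the VS Code terminal is missing.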
-
Hi,
This is a hail mary after spending countless hours on this. At this point I'm desperate and sad, because I can't get it working again.
Back story: after much effort, I finally got PrivateGPT working on my Windows 10 box. After enjoying it for a few days, I wanted to see how h2oGPT worked, so I used their Windows installer. BIG MISTAKE. Not only could I not get h2oGPT to work on my local machine, I now cannot get PrivateGPT to run, reinstall, or do anything else because of the llama-cpp-python wheels error.
I get the error shown below. The steps I've taken to try to resolve it are as follows:
### The error I am getting is shown below:
```text
Using cached Pygments-2.16.1-py3-none-any.whl (1.2 MB)
Using cached msoffcrypto_tool-5.1.1-py3-none-any.whl (34 kB)
Building wheels for collected packages: llama-cpp-python
  Building wheel for llama-cpp-python (pyproject.toml) ... error
  error: subprocess-exited-with-error

  × Building wheel for llama-cpp-python (pyproject.toml) did not run successfully.
  │ exit code: 1
  ╰─> [131 lines of output]
  note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for llama-cpp-python
Failed to build llama-cpp-python
```
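Because no prebuilt wheel is being used here, pip has to compile llama-cpp-python from source, which only works if the MSVC compiler and CMake are visible to the shell pip runs in. Below is a minimal diagnostic sketch to check that before retrying the install (the script name is made up and it is not part of PrivateGPT or llama-cpp-python; it assumes a standard Visual Studio Build Tools setup):

```python
# check_build_tools.py -- sketch: report whether the native build tools a
# llama-cpp-python source build relies on (MSVC's cl.exe and cmake) are
# reachable on PATH in the current shell, i.e. the one pip will run from.
import shutil

for tool in ("cl", "cmake"):
    location = shutil.which(tool)
    print(f"{tool:>5}: {location if location else 'NOT FOUND on PATH'}")
```

If both tools are found in the Developer Command Prompt for VS 2022 but not in the VS Code terminal, that matches the PATH discrepancy described in the reply above, and running the install from the Developer Command Prompt (or launching VS Code from it so the terminal inherits that environment) is worth trying.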