the gpt4all model is not working #1140
Unanswered
SrinivasaKalyan asked this question in Q&A
Replies: 0 comments
PS D:\D\project\LLM\Private-Chatbot> python privateGPT.py
gguf_init_from_file: invalid magic number 67676d6c
gguf_init_from_file: invalid magic number 67676d6c
gguf_init_from_file: invalid magic number 67676d6c
Traceback (most recent call last):
File "D:\D\project\LLM\Private-Chatbot\privateGPT.py", line 95, in <module>
main()
File "D:\D\project\LLM\Private-Chatbot\privateGPT.py", line 50, in main
llm = GPT4All(model=model_path, max_tokens=model_n_ctx, backend='gpt2', n_batch=model_n_batch, callbacks=callbacks, verbose=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\srini\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\langchain\load\serializable.py", line 74, in __init__
super().__init__(**kwargs)
File "pydantic\main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for GPT4All
root
Unable to instantiate model: code=129, Model format not supported (no matching implementation found) (type=value_error)
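A note on the log above: `67676d6c` is the ASCII bytes of `ggml`, which suggests the model file is in the legacy GGML container format, while the newer loader expects GGUF. You can inspect the first four bytes of the file to confirm which format you have before passing it to `GPT4All`. The `check_model_format` helper below is a hypothetical sketch, not part of privateGPT or gpt4all:

```python
import struct

GGUF_MAGIC = 0x46554747  # the bytes b"GGUF" read as a little-endian uint32
GGML_MAGIC = 0x67676D6C  # the legacy "ggml" magic reported in the error log

def check_model_format(path):
    """Read the 4-byte magic at the start of a model file and label it.
    Hypothetical helper for diagnosing the 'invalid magic number' error."""
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))
    if magic == GGUF_MAGIC:
        return "gguf"
    if magic == GGML_MAGIC:
        return "ggml (legacy format, needs conversion to GGUF)"
    return f"unknown (0x{magic:08x})"

# Example: check_model_format("models/ggml-gpt4all-j-v1.3-groovy.bin")
# would report the legacy format for an old GGML-era model file.
```

If the file turns out to be legacy GGML, the usual fix is to download a GGUF build of the model or convert it with the conversion scripts shipped in the llama.cpp repository.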