Update llm.py #690

Open · wants to merge 1 commit into main

Conversation

th3-m3ss14h
This one-line change is all I needed to fix the following error, which I hit after following the README's instructions to the letter on Windows 11:

Traceback (most recent call last):
  File "D:\ai\applications\devika\.venv\Lib\site-packages\flask\app.py", line 1511, in wsgi_app
    response = self.full_dispatch_request()
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai\applications\devika\.venv\Lib\site-packages\flask\app.py", line 919, in full_dispatch_request
    rv = self.handle_user_exception(e)
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai\applications\devika\.venv\Lib\site-packages\flask_cors\extension.py", line 176, in wrapped_function
    return cors_after_request(app.make_response(f(*args, **kwargs)))
                                                ^^^^^^^^^^^^^^^^^^
  File "D:\ai\applications\devika\.venv\Lib\site-packages\flask\app.py", line 917, in full_dispatch_request
    rv = self.dispatch_request()
         ^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai\applications\devika\.venv\Lib\site-packages\flask\app.py", line 902, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)  # type: ignore[no-any-return]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai\applications\devika\src\logger.py", line 59, in wrapper
    response = func(*args, **kwargs)
               ^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai\applications\devika\devika.py", line 63, in data
    models = LLM().list_models()
             ^^^^^
  File "D:\ai\applications\devika\src\llm\llm.py", line 72, in __init__
    self.models["OLLAMA"] = [(model["name"], model["name"]) for model in ollama.models]
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\ai\applications\devika\src\llm\llm.py", line 72, in <listcomp>
    self.models["OLLAMA"] = [(model["name"], model["name"]) for model in ollama.models]
                              ~~~~~^^^^^^^^
  File "D:\ai\applications\devika\.venv\Lib\site-packages\ollama\_types.py", line 33, in __getitem__
    raise KeyError(key)
KeyError: 'name'
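The PR page does not show the diff itself, but the traceback points at the list comprehension on line 72 of llm.py: the entries returned by the ollama client no longer carry a "name" key (newer client versions key each model entry under "model" instead). One way the one-line change could look is a defensive lookup that tolerates either key. The sketch below is illustrative, not the actual patch; the helper name `ollama_model_pairs` and the sample data are assumptions, and it relies only on the entries being subscriptable (which the `__getitem__` frame in the traceback confirms):

```python
# Sketch of a defensive fix for the KeyError above, under the assumption
# that older ollama client responses key each entry as {"name": ...} and
# newer ones as {"model": ...}. The entries only need to support
# subscripting, which the ollama `__getitem__` frame in the traceback shows.

def ollama_model_pairs(models):
    """Return (display_name, model_id) tuples, tolerating either key."""
    pairs = []
    for model in models:
        try:
            tag = model["name"]   # key used by older client versions
        except KeyError:
            tag = model["model"]  # key used by newer client versions
        pairs.append((tag, tag))
    return pairs

# Both payload shapes work (sample data, not a live API response):
old_style = [{"name": "llama3:latest"}]
new_style = [{"model": "mistral:7b"}]
print(ollama_model_pairs(old_style + new_style))
# → [('llama3:latest', 'llama3:latest'), ('mistral:7b', 'mistral:7b')]
```

With this shape, `self.models["OLLAMA"] = ollama_model_pairs(ollama.models)` would keep working across both client versions rather than pinning the code to one response schema.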

1 participant