with openllm
#13765
Replies: 1 comment
-
Hi @melfebulu, I am also facing this issue now. Can you explain how you solved it?
-
```python
from langchain.llms import OpenLLM

llm = OpenLLM(server_url='http://localhost:8001')
llm("What is the difference between a duck and a goose?")
```

This raises the following error:

```
TypeError                                 Traceback (most recent call last)
in <cell line: 5>()
      3
      4 llm = OpenLLM(server_url='http://localhost:8001')
----> 5 llm("What is the difference between a duck and a goose?")

3 frames
/usr/local/lib/python3.10/dist-packages/langchain/llms/openllm.py in _identifying_params(self)
    218         """Get the identifying parameters."""
    219         if self._client is not None:
--> 220             self.llm_kwargs.update(self._client._config())
    221             model_name = self._client._metadata()["model_name"]
    222             model_id = self._client._metadata()["model_id"]

TypeError: 'dict' object is not callable
```
--------------------------------- ChatGPT answer below ---------------------------------
The error message indicates a TypeError raised when a dictionary object is called as if it were a function. It occurs inside the _identifying_params method in langchain/llms/openllm.py, specifically at self.llm_kwargs.update(self._client._config()): the code calls _config, but on your client _config appears to be a dict rather than a method.
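For context, the underlying failure mode is simply calling a dict as if it were a function, as this tiny standalone example (unrelated to openllm itself) shows:

```python
config = {"temperature": 0.7}  # a plain dict
config()                       # TypeError: 'dict' object is not callable
```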
To address this issue, you might want to consider the following steps:

- Check the OpenLLM class: First, make sure you understand how the OpenLLM wrapper uses the _client object. Specifically, you need to determine whether _config is a method or an attribute.
- If _config is an attribute: If _config is a dictionary attribute rather than a method, use it directly instead of calling it; the failing line would become self.llm_kwargs.update(self._client._config) (see the sketch after this list).
- If _config is a method: If _config is indeed a method, there may be an error in the OpenLLM wrapper itself. In that case, check the class definition or the relevant documentation to understand the correct usage.
- Check the library versions: Make sure you are using up-to-date versions of langchain and openllm. Errors like this are sometimes bugs in an earlier release of a library that have been fixed in a later one.
- Consult documentation or seek help: If you're still unsure how to resolve this, consult the langchain documentation or ask in the relevant developer forums or communities.
- Report the issue: If you believe this is a bug in langchain and can't find a solution in the documentation, report it to the library's maintainers.
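If _config (and _metadata) really have become plain dict attributes or properties on the client, a defensive version of the failing lines could look like the sketch below. This is only an illustration based on the traceback above, not an official fix; attribute names and behaviour may differ between openllm releases.

```python
# Sketch of a possible adjustment to _identifying_params in
# langchain/llms/openllm.py, handling both old (method) and new (attribute) clients.
if self._client is not None:
    client_config = self._client._config
    if callable(client_config):        # older openllm clients expose _config()
        client_config = client_config()
    self.llm_kwargs.update(client_config)

    client_metadata = self._client._metadata
    if callable(client_metadata):      # same fallback for _metadata
        client_metadata = client_metadata()
    model_name = client_metadata["model_name"]
    model_id = client_metadata["model_id"]
```

If editing an installed package is not an option, aligning the langchain and openllm versions (the version-check step above) is the cleaner route.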