【v0.2】'str' object has no attribute 'model_dump' #22234
Example Code

import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage
os.environ['OPENAI_API_KEY'] = 'sk-xxx'
model = ChatOpenAI(openai_api_base="proxy_website")
messages = [
SystemMessage(content="Translate the following from English into chinese"),
HumanMessage(content="hi!"),
]
model.invoke(messages)

Description

I followed the official tutorial (https://python.langchain.com/v0.2/docs/tutorials/llm_chain/#detailed-walkthrough) to run a simple chat example like the one above, but I ran into the following error:

---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[6], line 6
      1 messages = [
      2     SystemMessage(content="Translate the following from English into chinese"),
      3     HumanMessage(content="hi!"),
      4 ]
----> 6 model.invoke(messages)

File D:\SoftWare\environemnt\myenvs\langchain\Lib\site-packages\langchain_core\language_models\chat_models.py:170, in BaseChatModel.invoke(self, input, config, stop, **kwargs)
    159 def invoke(
    160     self,
    161     input: LanguageModelInput,
    (...)
    165     **kwargs: Any,
    166 ) -> BaseMessage:
    167     config = ensure_config(config)
    168     return cast(
    169         ChatGeneration,
--> 170     self.generate_prompt(
    171         [self._convert_input(input)],
    172         stop=stop,
    173         callbacks=config.get("callbacks"),
    174         tags=config.get("tags"),
    175         metadata=config.get("metadata"),
...
File D:\SoftWare\environemnt\myenvs\langchain\Lib\site-packages\langchain_openai\chat_models\base.py
    545     # will throw a "TypeError: 'NoneType' object is not iterable" error
    546     # to mask the true error. Because 'response["choices"]' is None.
    547     if response.get("error"):

AttributeError: 'str' object has no attribute 'model_dump'

System Info

Python 3.9.13
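One way to see what the proxy actually returns (an assumption on my part, not something from the tutorial) is to call it with the raw openai client and inspect the response type, since the error suggests LangChain received a plain string instead of a parsed response object:

from openai import OpenAI

# "proxy_website" and the model name are placeholders; adjust them to your setup.
client = OpenAI(base_url="proxy_website", api_key="sk-xxx")
resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "hi!"}],
)
print(type(resp))  # expected: a ChatCompletion object, not str
print(resp)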
Replies: 4 comments 10 replies
-
Hi @sichenyong! I'm here to help you with any bugs, questions, or contributions you have. 😊 To resolve the 'str' object has no attribute 'model_dump' error, here's an example of how to correctly use the ChatOpenAI model:

import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage
os.environ['OPENAI_API_KEY'] = 'sk-xxx'
model = ChatOpenAI(openai_api_base="proxy_website")
messages = [
SystemMessage(content="Translate the following from English into Chinese"),
HumanMessage(content="hi!"),
]
response = model.invoke(messages)
print(response)

Ensure that you have the correct versions of the dependencies installed. Additionally, make sure your imports are correct and match the example above.
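If it helps, here is a minimal sketch for checking the installed versions (assumption: the packages were installed under their published PyPI names):

from importlib.metadata import version

# Print the installed version of each relevant package.
for pkg in ("langchain-core", "langchain-openai", "openai"):
    print(pkg, version(pkg))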
-
Hm, it seems like you're using a proxy, is that right? Do you see this issue when requests are sent to OpenAI directly (without the proxy)?
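For reference, a minimal sketch for testing without the proxy (assumption: your API key is valid against api.openai.com):

import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage

os.environ["OPENAI_API_KEY"] = "sk-xxx"
# No openai_api_base, so requests go to the official OpenAI endpoint.
model = ChatOpenAI()
print(model.invoke([HumanMessage(content="hi!")]).content)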
-
Hi, can you elaborate further on the solution? I'm using the correct …
-
When you use an OpenAI proxy, try appending /v1 to the URL, e.g. https://xxxx.xxxx.io/v1
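A minimal sketch of that suggestion (assumption: https://xxxx.xxxx.io is a placeholder for your proxy; the key point is that openai_api_base should point at the OpenAI-compatible /v1 root):

import os
from langchain_openai import ChatOpenAI
from langchain_core.messages import HumanMessage, SystemMessage

os.environ["OPENAI_API_KEY"] = "sk-xxx"
# Point the client at the proxy's /v1 root instead of the bare domain.
model = ChatOpenAI(openai_api_base="https://xxxx.xxxx.io/v1")

messages = [
    SystemMessage(content="Translate the following from English into Chinese"),
    HumanMessage(content="hi!"),
]
print(model.invoke(messages).content)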