Checked other resources
- [x] I added a very descriptive title to this question.
- [x] I searched the LangChain documentation with the integrated search.
- [x] I used the GitHub search to find a similar question and didn't find it.
Commit to Help
I commit to help with one of those options 👆
Example Code
```python
from langchain_openai import ChatOpenAI
from pydantic import BaseModel, Field

# OPENROUTER_API_KEY and OPENROUTER_BASE_URL are defined elsewhere
selected_model = "meta-llama/llama-3.3-70b-instruct"
llm_model = ChatOpenAI(
    model=selected_model,
    openai_api_key=OPENROUTER_API_KEY,
    openai_api_base=OPENROUTER_BASE_URL,
)


class Joke(BaseModel):
    """Joke to tell user."""

    setup: str = Field(description="The setup of the joke")
    punchline: str = Field(description="The punchline to the joke")
    rating: int = Field(description="How funny the joke is, from 1 to 10")


structured_llm = llm_model.with_structured_output(
    Joke.model_json_schema(), method="json_schema"
)
structured_llm.invoke("Tell me a joke about cats")
```
Description
I'm attempting to use LangChain with OpenRouter to get structured outputs. This works fine for OpenAI models, but for other models OpenRouter requires the following parameter to be sent with the request:

Sending the parameter in a manual request works. Is there a way to get the `ChatOpenAI` class to insert this parameter into its calls to the API? Any suggestions for how this could be done would be appreciated!
System Info
System Information
OS: Linux
OS Version: #53-Ubuntu SMP PREEMPT_DYNAMIC Sat Jan 11 00:06:25 UTC 2025
Python Version: 3.12.3 (main, Jan 17 2025, 18:03:48) [GCC 13.3.0]