Replies: 1 comment
Pass `include_raw=True`:

```python
response = llm.with_structured_output(
    schema=ResponseModel, method="json_mode", include_raw=True
).invoke(prompt)
print(type(response))  # <class 'dict'>
```
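With `include_raw=True`, the call returns a dict with `"raw"`, `"parsed"`, and `"parsing_error"` keys instead of just the parsed object; the metadata you're after lives on the raw `AIMessage`. A minimal sketch of reading it, continuing from the snippet above (`usage_metadata` and `response_metadata` are the standard `AIMessage` attributes in `langchain-core` 0.3):

```python
# The raw AIMessage keeps everything the provider returned.
raw_msg = response["raw"]

print(raw_msg.response_metadata)  # provider-specific metadata dict
print(raw_msg.usage_metadata)     # token counts: input_tokens / output_tokens / total_tokens

print(response["parsed"])         # the ResponseModel instance, as before
print(response["parsing_error"])  # None unless parsing failed
```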
Description
I am trying to get the metadata of an LLM response. However, when using `with_structured_output`, it returns only the parsed response, nothing else.
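For reference, a minimal sketch of the setup being described (the post's own example code isn't shown; the `ResponseModel` schema and the Gemini model name here are my assumptions):

```python
from pydantic import BaseModel
from langchain_google_genai import ChatGoogleGenerativeAI


class ResponseModel(BaseModel):
    # Hypothetical schema, only for illustration.
    answer: str


llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash")  # model name assumed

response = llm.with_structured_output(
    schema=ResponseModel, method="json_mode"
).invoke("What is LangChain?")

print(type(response))  # ResponseModel -- parsed output only, no metadata attached
```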
System Info
```
➜ uv pip freeze | grep lang
google-ai-generativelanguage==0.6.18
langchain==0.3.26
langchain-core==0.3.67
langchain-google-genai==2.1.6
langchain-text-splitters==0.3.8
langgraph==0.5.1
langgraph-checkpoint==2.1.0
langgraph-prebuilt==0.5.2
langgraph-sdk==0.1.72
langsmith==0.4.4
```