How to calculate the cost of calling an LLM with structured output #30321
AsmaaMHadir
announced in
Ask Dosu (Archived)
To calculate the cost of calling an LLM with structured output in your LangGraph graph, you can use the `get_openai_callback` context manager from `langchain_community.callbacks`, which records token usage and cost for OpenAI calls made inside its scope.
Here's an example based on your provided code. Note two fixes: the callback must wrap the LLM call to record anything, and `include_raw=True` is needed because a structured-output call otherwise returns only the Pydantic object, which has no `response_metadata`:

```python
import traceback

from langchain_community.callbacks import get_openai_callback
from langchain_core.prompts import ChatPromptTemplate


async def generate_answer(self, state: State):
    """Generate the final answer based on context."""
    formatted_context = LangGraphUtils.format_context(state["context"])
    prompt_input = {
        "question": state["messages"][-1].content,
        "context": formatted_context,
        "chat_history": state["messages"],
    }
    out_prompt = ChatPromptTemplate.from_template(generation_prompt_template_v3).invoke(prompt_input)
    try:
        # The callback only records usage for calls made inside its scope,
        # so the LLM call must happen within the `with` block.
        with get_openai_callback() as cb:
            # include_raw=True returns both the parsed GeneratedAnswer and the
            # raw AIMessage, which carries the response_metadata needed below.
            result = await LangGraphConfig.initialize_llm().with_structured_output(
                GeneratedAnswer, method="function_calling", include_raw=True
            ).ainvoke(out_prompt)

        parsed = result["parsed"]  # GeneratedAnswer instance, or None if parsing failed
        raw = result["raw"]        # underlying AIMessage with response_metadata

        confidence_score = LangGraphUtils.calculate_weighted_confidence_score(raw.response_metadata)
        # `parsed` is your GeneratedAnswer model; store whichever field holds the answer text
        state["answer"] = parsed if parsed is not None else "Could not generate an answer."
        state["confidence_score"] = confidence_score

        # Token usage and cost accumulated by the callback handler
        return {
            "answer": state["answer"],
            "context": state["context"],
            "confidence_score": state["confidence_score"],
            "input_tokens": cb.prompt_tokens,
            "output_tokens": cb.completion_tokens,
            "total_tokens": cb.total_tokens,
            "cost": state["cost"] + cb.total_cost,
        }
    except Exception:
        logger.error("Error generating answer:\n" + traceback.format_exc())
        return None
```

This setup lets you track token usage and cost through the `get_openai_callback` handler even though the structured-output call returns a Pydantic object rather than a raw message.
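If the callback reports a cost of 0.0 (its built-in pricing table can lag behind newly released models), you can still compute the cost yourself from the token counts it records. The helper below is a minimal sketch; the per-million-token prices shown are placeholder values, not real pricing, and should be replaced with your model's actual rates.

```python
def estimate_cost(input_tokens: int, output_tokens: int,
                  input_price_per_m: float, output_price_per_m: float) -> float:
    """Estimate a call's cost in USD from token counts and per-million-token prices."""
    return (input_tokens * input_price_per_m + output_tokens * output_price_per_m) / 1_000_000

# Placeholder prices: $2.50 per million input tokens, $10.00 per million output tokens
cost = estimate_cost(input_tokens=1200, output_tokens=350,
                     input_price_per_m=2.50, output_price_per_m=10.00)
print(f"${cost:.4f}")  # $0.0065
```

In the node above you would call this with `cb.prompt_tokens` and `cb.completion_tokens` and add the result to `state["cost"]` instead of `cb.total_cost`.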
I have the following node in my LangGraph graph, and I tried using `with get_openai_callback() as cb` to calculate the cost and get the token counts, but I get an error since I am using the LLM with structured output.