Add support for Batch API of LLMs like OpenAI and Anthropic #27616
Replies: 6 comments 3 replies
-
Bumping this, as I opened a discussion looking for the same thing for Google's batch requests: langchain-ai/langchain-google#599. Are there any plans to support this soon, or is it on the roadmap? I'd be happy to help look into this if some guidance is provided on where the best place to start is, or which pieces of code to look at that we could leverage to get something working.
-
I've successfully integrated LangChain into my test code, but my main production code would definitely use batch processing from Anthropic because of the significant savings, and because I'm potentially analysing several hundred images on a daily basis. I'd still like to use LangChain, but after reading the above I will just hack it to use LangSmith logging instead.
-
@garethcurtis
-
I can't believe it. People have made this request many times across multiple posts, and LangChain still doesn't support the batch API?!
-
I was also surprised there's no native support for batch APIs, given how common they are in production LLM workflows. The tricky part seems to be that batching essentially "breaks the chain": you have to wait for the batch to complete before resuming any chain logic, which runs somewhat counter to LangChain's design philosophy of frictionless pluggability, e.g. extending a high-level interface like Runnable. One idea: extend BaseChatModel. Might be naive, but could be a path forward...
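To make the "breaks the chain" point concrete, here is a minimal, provider-agnostic sketch of the blocking step any batch integration would need: poll the provider until the batch reaches a terminal state before any downstream chain logic can resume. The `poll_status` callable and the status strings are assumptions for illustration, not an actual LangChain or provider API.

```python
import time

# Hypothetical polling helper illustrating why batching "breaks the chain":
# the caller must block (or reschedule) until the provider marks the batch done.
def wait_for_batch(poll_status, interval_s: float = 1.0, timeout_s: float = 60.0) -> str:
    """poll_status() returns the provider's current batch status string."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = poll_status()
        # Terminal states here mirror common provider vocabularies (assumption).
        if status in ("completed", "failed", "expired", "cancelled"):
            return status
        time.sleep(interval_s)
    raise TimeoutError("batch did not finish within timeout")
```

In a real integration this loop would likely live behind an async interface or a callback, so the rest of the chain isn't blocked on a window that can be as long as 24 hours.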
-
It also occurred to me that there really should be an integration that provides the option to leverage Batch APIs for running LangSmith evaluations in the cloud. That actually would be a pluggable solution. It could also abstract away frustrating things like batch expiration (I'm looking at you, OpenAI). I figure people are more willing to pay for tools that save them money, right?
-
Feature request
Large Language Model providers like OpenAI and Anthropic support batch requests of prompts through their APIs, with higher token limits and reduced cost. These are currently not supported in LangChain.
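For context, OpenAI's Batch API takes a JSONL file where each line is a self-contained request (`custom_id`, `method`, `url`, `body`); the file is uploaded and a batch job is created over it. The helper below only builds that local JSONL payload; the model name and `custom_id` scheme are illustrative.

```python
import json

def build_batch_line(custom_id: str, prompt: str, model: str = "gpt-4o-mini") -> str:
    """Build one line of an OpenAI Batch API input file (JSONL format)."""
    request = {
        "custom_id": custom_id,          # caller-chosen ID to match results back
        "method": "POST",
        "url": "/v1/chat/completions",   # batch endpoint being targeted
        "body": {
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
    }
    return json.dumps(request)

prompts = ["Hello", "World"]
lines = [build_batch_line(f"req-{i}", p) for i, p in enumerate(prompts)]
jsonl = "\n".join(lines)  # written to a file, uploaded, then passed to batches.create
```

A LangChain integration would presumably generate this payload from a list of prompt values and hide the upload/poll/download cycle behind a standard interface.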
Motivation
I am a full-stack LLM apps developer, and my current project, which uses LangChain, needs support for OpenAI's Batch API.
Proposal (If applicable)
We should define adapters and standard interfaces for the Batch APIs within `langchain_core`, and leave the implementation to provider packages such as `langchain_openai`, `langchain_anthropic`, and others.
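The proposed split could look something like the sketch below: an abstract interface in `langchain_core` and provider-specific subclasses elsewhere. Every name here (`BaseBatchModel`, `BatchJob`, the method names) is a hypothetical illustration, not an existing LangChain API; the fake implementation just shows the shape a provider package would fill in.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Sequence

@dataclass
class BatchJob:
    """Provider-agnostic handle for a submitted batch (illustrative)."""
    id: str
    status: str  # e.g. "submitted", "in_progress", "completed"

class BaseBatchModel(ABC):
    """Hypothetical standard interface that could live in langchain_core."""

    @abstractmethod
    def submit_batch(self, prompts: Sequence[str]) -> BatchJob:
        """Submit prompts to the provider's batch endpoint."""

    @abstractmethod
    def retrieve_batch(self, job_id: str) -> BatchJob:
        """Refresh the job's status from the provider."""

    @abstractmethod
    def batch_results(self, job_id: str) -> list[str]:
        """Return completions once the job has finished."""

# In-memory stand-in showing how a provider package might implement it.
class FakeBatchModel(BaseBatchModel):
    def __init__(self) -> None:
        self._jobs: dict[str, list[str]] = {}

    def submit_batch(self, prompts: Sequence[str]) -> BatchJob:
        job_id = f"batch-{len(self._jobs)}"
        self._jobs[job_id] = [f"echo: {p}" for p in prompts]
        return BatchJob(id=job_id, status="completed")  # fake: instant completion

    def retrieve_batch(self, job_id: str) -> BatchJob:
        return BatchJob(id=job_id, status="completed")

    def batch_results(self, job_id: str) -> list[str]:
        return self._jobs[job_id]
```

A real `langchain_openai` or `langchain_anthropic` subclass would map these methods onto the provider's upload, create, poll, and download calls, while chains only ever see the common interface.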