Failed to call the "abatch" method of OpenAI when the config "max_concurrency" is not set to 1 #20077
Hey @pursuemoon! Great to see you diving deep into LangChain again. How's everything been since our last chat?

Based on the code you've shared and the error you're encountering, it seems like the issue arises from the way LangChain's `abatch` method handles the `max_concurrency` setting. To resolve this issue and utilize higher concurrency levels, you need to modify the `abatch` method:

```python
import asyncio
from typing import Any, List, Optional, Union

from langchain_core.language_models import LanguageModelInput
from langchain_core.runnables.config import RunnableConfig, get_config_list


async def abatch(
    self,
    inputs: List[LanguageModelInput],
    config: Optional[Union[RunnableConfig, List[RunnableConfig]]] = None,
    *,
    return_exceptions: bool = False,
    **kwargs: Any,
) -> List[str]:
    if not inputs:
        return []
    configs = get_config_list(config, len(inputs))
    # Default to 1 if max_concurrency is not set
    max_concurrency = configs[0].get("max_concurrency") or 1
    # Split inputs into chunks of at most max_concurrency items
    input_chunks = [
        inputs[i : i + max_concurrency]
        for i in range(0, len(inputs), max_concurrency)
    ]
    results: List[str] = []
    for chunk_index, chunk in enumerate(input_chunks):
        offset = chunk_index * max_concurrency
        # Use asyncio.gather to run predictions concurrently within each chunk,
        # pairing every input with its own per-call config
        chunk_results = await asyncio.gather(
            *[
                self.ainvoke(input_, config=configs[offset + i], **kwargs)
                for i, input_ in enumerate(chunk)
            ],
            return_exceptions=return_exceptions,
        )
        results.extend(chunk_results)
    return results
```

This modification ensures that within each chunk of inputs, predictions are run concurrently up to the limit specified by `max_concurrency`.

As for your question about any known limitations or restrictions in the `abatch` method, they mainly highlight the importance of proper configuration, error handling, and understanding the operational constraints of the method.
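For a concrete picture of the intended call pattern, here is a minimal sketch of driving `abatch` with a `max_concurrency` setting. It assumes `langchain_openai` is installed and `OPENAI_API_KEY` is set in the environment; the model name and prompts are illustrative placeholders, not taken from the report above.

```python
import asyncio

from langchain_openai import OpenAI


async def main() -> None:
    # Placeholder model; any OpenAI completion model works here
    llm = OpenAI(model="gpt-3.5-turbo-instruct")
    prompts = [f"Translate to French: sentence {i}" for i in range(8)]
    # max_concurrency > 1 is exactly the configuration that triggered the error
    results = await llm.abatch(prompts, config={"max_concurrency": 4})
    print(results)


asyncio.run(main())
```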
Checked other resources
Commit to Help
Example Code
Description
I am writing an asynchronous function to process batch translation with a chain constructed by LCEL and `OpenAI`. It raises a `ValueError` when the config value `max_concurrency` is set to more than 1; meanwhile, the function works when `max_concurrency` is set to 1. If I switch to `LLMChain`, this error does not happen, which is expected.

I am confused because calling with a batch size greater than 1 is normal and reasonable. It is expected that no matter how high `max_concurrency` is set, `batch` or `abatch` should work. I am really interested in using advanced LCEL features when developing. Thanks to everyone who helps me answer this question!
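A minimal sketch of the kind of LCEL batch-translation call described above, assuming `langchain_openai` with a valid `OPENAI_API_KEY`; the prompt template and input texts are illustrative placeholders, not my actual example code:

```python
import asyncio
from typing import List

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_openai import OpenAI


async def translate_batch(texts: List[str]) -> List[str]:
    prompt = PromptTemplate.from_template(
        "Translate the following text to English:\n{text}"
    )
    # LCEL chain: prompt -> completion model -> plain string output
    chain = prompt | OpenAI() | StrOutputParser()
    # Setting max_concurrency greater than 1 is what triggers the ValueError
    return await chain.abatch(
        [{"text": t} for t in texts],
        config={"max_concurrency": 4},
    )


print(asyncio.run(translate_batch(["Bonjour le monde", "Guten Morgen"])))
```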
System Info
System Information
Package Information
Packages not installed (Not Necessarily a Problem)
The following packages were not found: