Bug description
While using the Azure OpenAI chat client, streaming does not work; it fails with the NullPointerException below. When I subscribe to the .content() Flux and print its signals, the last received token is null.
2025-04-10 17:38:22.611 [http-nio-8080-exec-7] ERROR o.a.c.c.C.[.[.[.[dispatcherServlet].log - Servlet.service() for servlet [dispatcherServlet] threw exception
java.lang.NullPointerException: Cannot invoke "com.azure.ai.openai.models.ChatResponseMessage.getToolCalls()" because "responseMessage" is null
at org.springframework.ai.azure.openai.AzureOpenAiChatModel.buildGeneration(AzureOpenAiChatModel.java:498)
Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException:
Assembly trace from producer [reactor.core.publisher.FluxMapFuseable] :
reactor.core.publisher.Flux.map(Flux.java:6588)
org.springframework.ai.azure.openai.AzureOpenAiChatModel.lambda$internalStream$13(AzureOpenAiChatModel.java:381)
Environment
Spring-AI 1.0.0-M6
Chat Model: Azure OpenAI
Steps to reproduce
1. Set a custom content filter on your model: Azure AI Foundry -> Safety + Security -> Create content filter -> Output filter -> Streaming mode (Preview) -> Asynchronous Filter
2. Run a streaming chat call and print the emitted tokens:
chatClient.prompt().user("How are you?").stream().content().doOnEach(data -> System.out.println(data.get()));
//output
How
can
I
assist
you
today
?
null
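For completeness, subscribing with an explicit error handler (a slight variant of the call above, using the same chatClient) surfaces the NullPointerException on the consumer side once the final chunk arrives:

chatClient.prompt()
    .user("How are you?")
    .stream()
    .content()
    .subscribe(
        token -> System.out.println(token),   // each streamed token
        error -> error.printStackTrace());    // the NullPointerException from buildGeneration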
Expected behavior
AzureOpenAiChatModel should be null safe when the streamed response message is missing, or the null values should be filtered out of the stream.
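To illustrate what "null safe" could mean here, a minimal sketch of a guard around the response message follows. The helper class and method names are hypothetical; only ChatResponseMessage, ChatCompletionsToolCall, and their getters come from the Azure SDK, and the real fix inside AzureOpenAiChatModel.buildGeneration may look different.

import com.azure.ai.openai.models.ChatCompletionsToolCall;
import com.azure.ai.openai.models.ChatResponseMessage;
import java.util.Collections;
import java.util.List;

// Hypothetical helper illustrating the expected behavior: treat a missing response
// message (which the asynchronous content filter apparently produces in its final
// streamed chunk) as an empty delta instead of dereferencing null.
final class ResponseMessageGuard {

    static String contentOrEmpty(ChatResponseMessage responseMessage) {
        return (responseMessage == null || responseMessage.getContent() == null)
                ? ""
                : responseMessage.getContent();
    }

    static List<ChatCompletionsToolCall> toolCallsOrEmpty(ChatResponseMessage responseMessage) {
        return (responseMessage == null || responseMessage.getToolCalls() == null)
                ? Collections.emptyList()
                : responseMessage.getToolCalls();
    }
}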
Minimal Complete Reproducible example
See above. When I switch to another vendor such as Anthropic, the result is as expected (no null at the end of the stream).
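If a fuller snippet helps, a controller along these lines should reproduce it, assuming the Spring AI Azure OpenAI starter auto-configures the ChatClient.Builder against the deployment with the filter above (class and endpoint names are illustrative):

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
class StreamingDemoController {

    private final ChatClient chatClient;

    StreamingDemoController(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    // Streaming from the Azure OpenAI deployment with the asynchronous content
    // filter enabled ends with the NullPointerException shown above.
    @GetMapping(value = "/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    Flux<String> stream() {
        return chatClient.prompt()
            .user("How are you?")
            .stream()
            .content();
    }
}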