
Commit 49a1cc6

Update Mistral AI function calling docs and layout
1 parent 9330187 commit 49a1cc6

4 files changed: 27 additions, 7 deletions


spring-ai-docs/src/main/antora/modules/ROOT/pages/api/clients/functions/mistralai-chat-functions.adoc

Lines changed: 16 additions & 2 deletions
@@ -1,12 +1,12 @@
-= Mistral Function Calling
+= Mistral AI Function Calling

You can register custom Java functions with the `MistralAiChatClient` and have the Mistral AI models intelligently choose to output a JSON object containing arguments to call one or many of the registered functions.
This allows you to connect the LLM capabilities with external tools and APIs.
The `mistral_small_latest` and `mistral_large_latest` models are trained to detect when a function should be called and to respond with JSON that adheres to the function signature.

The MistralAI API does not call the function directly; instead, the model generates JSON that you can use to call the function in your code and return the result back to the model to complete the conversation.

-NOTE: Currently the MistralAI API doesn't support parallel function calling, similarly to the OpenAI API, Azure OpenAI API, and Vertex AI Gemini API.
+NOTE: As of March 13, 2024, Mistral AI has integrated support for parallel function calling into their `mistral_large_latest` model, a feature that was absent at the time of the initial Spring AI Mistral AI integration.

Spring AI provides flexible and user-friendly ways to register and call custom functions.
In general, the custom functions need to provide a function `name`, `description`, and the function call `signature` (as JSON schema) to let the model know what arguments the function expects.
@@ -191,3 +191,17 @@ NOTE: The in-prompt registered functions are enabled by default for the duration
This approach allows you to dynamically choose different functions to be called based on the user input.

The https://github.com/spring-projects/spring-ai/blob/main/spring-ai-spring-boot-autoconfigure/src/test/java/org/springframework/ai/autoconfigure/mistralai/tool/PaymentStatusPromptIT.java[PaymentStatusPromptIT.java] integration test provides a complete example of how to register a function with the `MistralAiChatClient` and use it in a prompt request.
+
+
+== Appendices
+
+=== https://spring.io/blog/2024/03/06/function-calling-in-java-and-spring-ai-using-the-latest-mistral-ai-api[(Blog) Function Calling in Java and Spring AI using the latest Mistral AI API]
+
+=== Mistral AI API Function Calling Flow
+
+The following diagram illustrates the flow of the Mistral AI low-level API for link:https://docs.mistral.ai/guides/function-calling[Function Calling]:
+
+image:mistral-ai-function-calling-flow.jpg[title="Mistral AI API Function Calling Flow", width=800, link=https://docs.mistral.ai/guides/function-calling]
+
+The link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-mistral-ai/src/test/java/org/springframework/ai/mistralai/api/tool/PaymentStatusFunctionCallingIT.java[PaymentStatusFunctionCallingIT.java] integration test provides a complete example of how to use the Mistral AI API function calling.
+It is based on the https://docs.mistral.ai/guides/function-calling[Mistral AI Function Calling tutorial].
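
For readers of this diff, here is a minimal sketch of the registration pattern the updated page describes, loosely in the spirit of the referenced PaymentStatusPromptIT.java. It assumes the Spring AI 0.8.x-era API touched by this commit (`MistralAiChatClient`, `MistralAiChatOptions`, `FunctionCallbackWrapper`); the record types, the service class, and the `retrievePaymentStatus` name are illustrative, not the identifiers used in the actual test.

[source,java]
----
import java.util.List;
import java.util.function.Function;

import org.springframework.ai.chat.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.mistralai.MistralAiChatClient;
import org.springframework.ai.mistralai.MistralAiChatOptions;
import org.springframework.ai.model.function.FunctionCallbackWrapper;

public class PaymentStatusExample {

    // Request/response types; their fields drive the JSON schema sent to the model.
    record StatusRequest(String transactionId) {}
    record StatusResponse(String status) {}

    // The Java function the model may ask to have invoked; a stub for illustration.
    static class RetrievePaymentStatusService implements Function<StatusRequest, StatusResponse> {
        @Override
        public StatusResponse apply(StatusRequest request) {
            return new StatusResponse("PAID"); // a real service would look the transaction up
        }
    }

    static ChatResponse ask(MistralAiChatClient chatClient) {
        var options = MistralAiChatOptions.builder()
            .withFunctionCallbacks(List.of(FunctionCallbackWrapper.builder(new RetrievePaymentStatusService())
                .withName("retrievePaymentStatus")                          // name the model refers to
                .withDescription("Get the payment status of a transaction") // helps the model decide when to call it
                .build()))
            .build();

        // Spring AI sends the function definition with the request, invokes the function
        // when the model asks for it, and feeds the result back to produce the final answer.
        return chatClient.call(new Prompt("What is the status of my transaction T1001?", options));
    }
}
----

If the callback is registered as a Spring bean instead, it can be enabled per request by name, which is what the note about dynamically choosing different functions based on the user input refers to.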

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/clients/mistralai-chat.adoc

Lines changed: 8 additions & 2 deletions
@@ -108,7 +108,7 @@ This is useful if you want to use different MistralAI accounts for different mod

TIP: All properties prefixed with `spring.ai.mistralai.chat.options` can be overridden at runtime by adding a request specific <<chat-options>> to the `Prompt` call.

-=== Chat Options [[chat-options]]
+== Chat Options [[chat-options]]

The link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-mistral-ai/src/main/java/org/springframework/ai/mistralai/MistralAiChatOptions.java[MistralAiChatOptions.java] provides model configurations, such as the model to use, the temperature, the frequency penalty, etc.

@@ -131,7 +131,13 @@ ChatResponse response = chatClient.call(

TIP: In addition to the model specific link:https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-mistral-ai/src/main/java/org/springframework/ai/mistralai/MistralAiChatOptions.java[MistralAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/ChatOptionsBuilder.java[ChatOptionsBuilder#builder()].

-=== Sample Controller (Auto-configuration)
+== Function Calling
+
+You can register custom Java functions with the MistralAiChatClient and have the Mistral AI model intelligently choose to output a JSON object containing arguments to call one or many of the registered functions.
+This is a powerful technique to connect the LLM capabilities with external tools and APIs.
+Read more about xref:api/clients/functions/mistralai-chat-functions.adoc[Mistral AI Function Calling].
+
+== Sample Controller (Auto-configuration)

https://start.spring.io/[Create] a new Spring Boot project and add the `spring-ai-mistralai-spring-boot-starter` to your pom (or gradle) dependencies.
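
Relating this to the `Chat Options` heading promoted above, the following is a minimal sketch of overriding the default options for a single request, again assuming the 0.8.x-era `MistralAiChatOptions` builder; the model name and temperature are illustrative values, not recommendations.

[source,java]
----
import org.springframework.ai.chat.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;
import org.springframework.ai.mistralai.MistralAiChatClient;
import org.springframework.ai.mistralai.MistralAiChatOptions;

public class RequestOptionsExample {

    static ChatResponse generate(MistralAiChatClient chatClient) {
        // Request-specific options; they take precedence over the
        // spring.ai.mistralai.chat.options.* defaults for this call only.
        var options = MistralAiChatOptions.builder()
            .withModel("mistral-small-latest") // illustrative model name
            .withTemperature(0.4f)
            .build();

        return chatClient.call(new Prompt("Generate the names of 5 famous pirates.", options));
    }
}
----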

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/clients/openai-chat.adoc

Lines changed: 3 additions & 3 deletions
Original file line numberDiff line numberDiff line change
@@ -112,7 +112,7 @@ This is useful if you want to use different OpenAI accounts for different models

TIP: All properties prefixed with `spring.ai.openai.chat.options` can be overridden at runtime by adding a request specific <<chat-options>> to the `Prompt` call.

-=== Chat Options [[chat-options]]
+== Chat Options [[chat-options]]

The https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions.java] provides model configurations, such as the model to use, the temperature, the frequency penalty, etc.

@@ -135,13 +135,13 @@ ChatResponse response = chatClient.call(

TIP: In addition to the model specific https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-openai/src/main/java/org/springframework/ai/openai/OpenAiChatOptions.java[OpenAiChatOptions] you can use a portable https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/ChatOptions.java[ChatOptions] instance, created with the https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/chat/ChatOptionsBuilder.java[ChatOptionsBuilder#builder()].

-=== Function Calling
+== Function Calling

You can register custom Java functions with the OpenAiChatClient and have the OpenAI model intelligently choose to output a JSON object containing arguments to call one or many of the registered functions.
This is a powerful technique to connect the LLM capabilities with external tools and APIs.
Read more about xref:api/clients/functions/openai-chat-functions.adoc[OpenAI Function Calling].

-=== Sample Controller (Auto-configuration)
+== Sample Controller (Auto-configuration)

https://start.spring.io/[Create] a new Spring Boot project and add the `spring-ai-openai-spring-boot-starter` to your pom (or gradle) dependencies.
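
The TIP in both chat pages points to the portable `ChatOptions` created with `ChatOptionsBuilder#builder()`. Below is a hedged sketch of that provider-neutral variant, assuming the builder exposes the portable temperature and top-p setters of that era; the values are illustrative.

[source,java]
----
import org.springframework.ai.chat.ChatClient;
import org.springframework.ai.chat.ChatOptions;
import org.springframework.ai.chat.ChatOptionsBuilder;
import org.springframework.ai.chat.ChatResponse;
import org.springframework.ai.chat.prompt.Prompt;

public class PortableOptionsExample {

    static ChatResponse generate(ChatClient chatClient) {
        // Portable options work with any ChatClient implementation
        // (OpenAiChatClient, MistralAiChatClient, ...), not just one provider.
        ChatOptions options = ChatOptionsBuilder.builder()
            .withTemperature(0.4f)
            .withTopP(0.9f)
            .build();

        return chatClient.call(new Prompt("Generate the names of 5 famous pirates.", options));
    }
}
----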
