
Commit aa55a96

Docs: add Mistral AI Function Calling doc
1 parent 0995515 commit aa55a96

6 files changed (+199, -5 lines changed)


spring-ai-docs/src/main/antora/modules/ROOT/nav.adoc

Lines changed: 1 addition & 0 deletions
@@ -30,6 +30,7 @@
 **** xref:api/clients/vertexai-gemini-chat.adoc[VertexAI Gemini]
 ***** xref:api/clients/functions/vertexai-gemini-chat-functions.adoc[Function Calling]
 *** xref:api/clients/mistralai-chat.adoc[Mistral AI]
+**** xref:api/clients/functions/mistralai-chat-functions.adoc[Function Calling]
 ** xref:api/imageclient.adoc[]
 *** xref:api/clients/image/openai-image.adoc[OpenAI]
 *** xref:api/clients/image/stabilityai-image.adoc[Stability]

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/clients/functions/azure-open-ai-chat-functions.adoc

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@ public class MockWeatherService implements Function<Request, Response> {
     public record Response(double temp, Unit unit) {}

     public Response apply(Request request) {
-        return new Response("30", Unit.C);
+        return new Response(30.0, Unit.C);
     }
 }
 ----

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/clients/functions/mistralai-chat-functions.adoc

Lines changed: 193 additions & 0 deletions
@@ -0,0 +1,193 @@
= Mistral Function Calling

You can register custom Java functions with the `MistralAiChatClient` and have the Mistral AI models intelligently choose to output a JSON object containing arguments to call one or many of the registered functions.
This allows you to connect the LLM capabilities with external tools and APIs.
The `mistral_small_latest` and `mistral_large_latest` models are trained to detect when a function should be called and to respond with JSON that adheres to the function signature.

The MistralAI API does not call the function directly; instead, the model generates JSON that you can use to call the function in your code and return the result back to the model to complete the conversation.

NOTE: Currently, the MistralAI API doesn't support parallel function calling, similarly to the OpenAI API, Azure OpenAI API, and Vertex AI Gemini API.

Spring AI provides flexible and user-friendly ways to register and call custom functions.
In general, the custom functions need to provide a function `name`, `description`, and the function call `signature` (as a JSON schema) to let the model know what arguments the function expects.
The `description` helps the model understand when to call the function.

As a developer, you need to implement a function that takes the function call arguments sent from the AI model and responds with the result back to the model.
Your function can in turn invoke other 3rd-party services to provide the results.
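
For illustration, such a function might delegate to an external HTTP weather service. The following is a minimal sketch only: the `weather.example.com` endpoint and its `{"temp": ...}` response shape are hypothetical, and Spring Framework's `RestClient` is used here just as one way to make the call:

[source,java]
----
import java.util.Map;
import java.util.function.Function;

import org.springframework.web.client.RestClient;

// Sketch of a function that calls a hypothetical external weather API.
public class RemoteWeatherService implements Function<RemoteWeatherService.Request, RemoteWeatherService.Response> {

    public record Request(String location) {}
    public record Response(double temp) {}

    private final RestClient restClient = RestClient.create();

    @Override
    public Response apply(Request request) {
        // Hypothetical endpoint returning a body such as {"temp": 21.5}
        Map<?, ?> body = this.restClient.get()
            .uri("https://weather.example.com/v1/current?location={location}", request.location())
            .retrieve()
            .body(Map.class);
        return new Response(((Number) body.get("temp")).doubleValue());
    }
}
----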

Spring AI makes this as easy as defining a `@Bean` definition that returns a `java.util.Function` and supplying the bean name as an option when invoking the `ChatClient`.

Under the hood, Spring wraps your POJO (the function) with the appropriate adapter code that enables interaction with the AI Model, saving you from writing tedious boilerplate code.
The basis of the underlying infrastructure is the link:https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/model/function/FunctionCallback.java[FunctionCallback.java] interface and the companion link:https://github.com/spring-projects/spring-ai/blob/main/spring-ai-core/src/main/java/org/springframework/ai/model/function/FunctionCallbackWrapper.java[FunctionCallbackWrapper.java] utility class to simplify the implementation and registration of Java callback functions.
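
As a taste of that infrastructure, the snippet below wraps a plain `java.util.Function` as a `FunctionCallback`. It is a sketch only: it assumes the builder-style factory on `FunctionCallbackWrapper` is available in your Spring AI version (the constructor form is shown later on this page), and it uses the `MockWeatherService` from the Quick Start below:

[source,java]
----
// Sketch: turning a plain java.util.Function into a FunctionCallback.
FunctionCallback callback = FunctionCallbackWrapper.builder(new MockWeatherService())
    .withName("CurrentWeather")                     // the name the model refers to
    .withDescription("Get the weather in location") // helps the model decide when to call it
    .build();
----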

== How it works

Suppose we want the AI model to respond with information that it does not have, for example the current temperature at a given location.

We can provide the AI model with metadata about our own functions that it can use to retrieve that information as it processes your prompt.

For example, if during the processing of a prompt the AI Model determines that it needs additional information about the temperature in a given location, it will start a server-side generated request/response interaction: the AI Model invokes a client-side function.
The AI Model provides the method invocation details as JSON, and it is the responsibility of the client to execute that function and return the response.

Spring AI greatly simplifies the code you need to write to support function invocation.
It brokers the function invocation conversation for you.
You can simply provide your function definition as a `@Bean` and then provide the bean name of the function in your prompt options.
You can also reference multiple function bean names in your prompt.

== Quick Start

Let's create a chatbot that answers questions by calling our own function.
To support the response of the chatbot, we will register our own function that takes a location and returns the current weather in that location.

When the model needs to answer a question such as `"What's the weather like in Boston?"`, it will invoke the client, providing the location value as an argument to be passed to the function. This RPC-like data is passed as JSON.

Our function can call some SaaS-based weather service API and return the weather response back to the model to complete the conversation.
In this example we will use a simple implementation named `MockWeatherService` that hard-codes the temperature for various locations.

The following `MockWeatherService.java` represents the weather service API:

[source,java]
----
public class MockWeatherService implements Function<Request, Response> {

    public enum Unit { C, F }
    public record Request(String location, Unit unit) {}
    public record Response(double temp, Unit unit) {}

    public Response apply(Request request) {
        return new Response(30.0, Unit.C);
    }
}
----

=== Registering Functions as Beans

With the link:../mistralai-chat.html#_auto_configuration[MistralAiChatClient Auto-Configuration] you have multiple ways to register custom functions as beans in the Spring context.

We start by describing the most POJO-friendly options.

==== Plain Java Functions

In this approach you define `@Beans` in your application context as you would any other Spring-managed object.

Internally, the Spring AI `ChatClient` will create an instance of a `FunctionCallbackWrapper` that adds the logic for invoking the function via the AI model.
The name of the `@Bean` is passed as a `ChatOption`.

[source,java]
----
@Configuration
static class Config {

    @Bean
    @Description("Get the weather in location") // function description
    public Function<MockWeatherService.Request, MockWeatherService.Response> weatherFunction1() {
        return new MockWeatherService();
    }
    ...
}
----

The `@Description` annotation is optional and provides a function description that helps the model understand when to call the function.
It is an important property to set to help the AI model determine what client-side function to invoke.

Another option is to use the `@JsonClassDescription` annotation on the `MockWeatherService.Request` record to provide the function description:

[source,java]
----
@Configuration
static class Config {

    @Bean
    public Function<Request, Response> currentWeather3() { // (1) bean name as function name.
        return new MockWeatherService();
    }
    ...
}

@JsonClassDescription("Get the weather in location") // (2) function description
public record Request(String location, Unit unit) {}
----

It is a best practice to annotate the request object with information such that the generated JSON schema of that function is as descriptive as possible, to help the AI model pick the correct function to invoke.
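
Beyond the class-level description, you can also describe individual request fields. The following is a sketch only: the property texts are illustrative, and it assumes Jackson annotations such as `@JsonProperty` and `@JsonPropertyDescription` are honored when the JSON schema for the function is generated:

[source,java]
----
import com.fasterxml.jackson.annotation.JsonClassDescription;
import com.fasterxml.jackson.annotation.JsonProperty;
import com.fasterxml.jackson.annotation.JsonPropertyDescription;

@JsonClassDescription("Get the weather in location")
public record Request(
        @JsonProperty(required = true, value = "location")
        @JsonPropertyDescription("The city and country, e.g. Paris, France")
        String location,

        @JsonPropertyDescription("The temperature unit, C or F")
        Unit unit) {
}
----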

The link:https://github.com/spring-projects/spring-ai/blob/main/spring-ai-spring-boot-autoconfigure/src/test/java/org/springframework/ai/autoconfigure/mistralai/tool/PaymentStatusBeanIT.java[PaymentStatusBeanIT.java] demonstrates this approach.

TIP: The link:https://github.com/spring-projects/spring-ai/blob/main/spring-ai-spring-boot-autoconfigure/src/test/java/org/springframework/ai/autoconfigure/mistralai/tool/PaymentStatusBeanOpenAiIT[PaymentStatusBeanOpenAiIT] implements the same function using the OpenAI API.
MistralAI is almost identical to OpenAI in this regard.

==== FunctionCallback Wrapper

Another way to register a function is to create a `FunctionCallbackWrapper` like this:

[source,java]
----
@Configuration
static class Config {

    @Bean
    public FunctionCallback weatherFunctionInfo() {

        return new FunctionCallbackWrapper<>("CurrentWeather", // (1) function name
            "Get the weather in location", // (2) function description
            (response) -> "" + response.temp() + response.unit(), // (3) response converter
            new MockWeatherService()); // function code
    }
    ...
}
----

It wraps the 3rd-party `MockWeatherService` function and registers it as a `CurrentWeather` function with the `MistralAiChatClient`.
It also provides a description (2) and an optional response converter (3) to convert the response into text as expected by the model.

NOTE: By default, the response converter does a JSON serialization of the Response object.

NOTE: The `FunctionCallbackWrapper` internally resolves the function call signature based on the `MockWeatherService.Request` class.

=== Specifying functions in Chat Options

To let the model know about and call your `CurrentWeather` function, you need to enable it in your prompt requests:

[source,java]
----
MistralAiChatClient chatClient = ...

UserMessage userMessage = new UserMessage("What's the weather like in Paris?");

ChatResponse response = chatClient.call(new Prompt(List.of(userMessage),
    MistralAiChatOptions.builder().withFunction("CurrentWeather").build())); // (1) Enable the function

logger.info("Response: {}", response);
----

NOTE: You can have multiple functions registered in your `ChatClient`, but only those enabled in the prompt request will be considered for the function calling.

The above user question will trigger a call to the `CurrentWeather` function (with Paris as the location) and produce the final response.
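
If you have registered several function beans, you can enable more than one of them for a single request. The following sketch assumes a `withFunctions` builder method that accepts a set of function names; the second function name, `retrievePaymentStatus`, is hypothetical:

[source,java]
----
// Sketch: enabling two registered functions for one request.
ChatResponse response = chatClient.call(new Prompt(List.of(userMessage),
    MistralAiChatOptions.builder()
        .withFunctions(Set.of("CurrentWeather", "retrievePaymentStatus")) // hypothetical second function bean
        .build()));
----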

=== Register/Call Functions with Prompt Options

In addition to the auto-configuration, you can register callback functions dynamically with your Prompt requests:

[source,java]
----
MistralAiChatClient chatClient = ...

UserMessage userMessage = new UserMessage("What's the weather like in Paris?");

var promptOptions = MistralAiChatOptions.builder()
    .withFunctionCallbacks(List.of(new FunctionCallbackWrapper<>(
        "CurrentWeather", // name
        "Get the weather in location", // function description
        new MockWeatherService()))) // function code
    .build();

ChatResponse response = chatClient.call(new Prompt(List.of(userMessage), promptOptions));
----

NOTE: The in-prompt registered functions are enabled by default for the duration of this request.

This approach allows you to dynamically choose different functions to be called based on the user input.

The https://github.com/spring-projects/spring-ai/blob/main/spring-ai-spring-boot-autoconfigure/src/test/java/org/springframework/ai/autoconfigure/mistralai/tool/PaymentStatusPromptIT.java[PaymentStatusPromptIT.java] integration test provides a complete example of how to register a function with the `MistralAiChatClient` and use it in a prompt request.

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/clients/functions/openai-chat-functions.adoc

Lines changed: 2 additions & 2 deletions
@@ -55,7 +55,7 @@ public class MockWeatherService implements Function<Request, Response> {
     public record Response(double temp, Unit unit) {}

     public Response apply(Request request) {
-        return new Response("30", Unit.C);
+        return new Response(30.0, Unit.C);
     }
 }
 ----
@@ -110,7 +110,7 @@ static class Config {
 public record Request(String location, Unit unit) {}
 ----

-It is a best practice to annotate the request object with information such that the generats JSON schema of that function is as descriptive as possible to help the AI model pick the correct funciton to invoke.
+It is a best practice to annotate the request object with information such that the generates JSON schema of that function is as descriptive as possible to help the AI model pick the correct function to invoke.

 The link:https://github.com/spring-projects/spring-ai/blob/main/spring-ai-spring-boot-autoconfigure/src/test/java/org/springframework/ai/autoconfigure/openai/tool/FunctionCallbackWithPlainFunctionBeanIT.java[FunctionCallbackWithPlainFunctionBeanIT.java] demonstrates this approach.

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/clients/functions/vertexai-gemini-chat-functions.adoc

Lines changed: 1 addition & 1 deletion
@@ -54,7 +54,7 @@ public class MockWeatherService implements Function<Request, Response> {
     public record Response(double temp, Unit unit) {}

     public Response apply(Request request) {
-        return new Response("30", Unit.C);
+        return new Response(30.0, Unit.C);
     }
 }
 ----

spring-ai-docs/src/main/antora/modules/ROOT/pages/api/functions.adoc

Lines changed: 1 addition & 1 deletion
@@ -8,4 +8,4 @@ Spring AI currently supports Function invocation for the following AI Models
 * OpenAI: Refer to the xref:api/clients/functions/openai-chat-functions.adoc[Open AI function invocation docs].
 * VertexAI Gemini: Refer to the xref:api/clients/functions/vertexai-gemini-chat-functions.adoc[Vertex AI Gemini function invocation docs].
 * Azure OpenAI: Refer to the xref:api/clients/functions/azure-open-ai-chat-functions.adoc[Azure OpenAI function invocation docs].
-// * Mistral AI: Refer to the xref:api/clients/functions/mistralai-chat-functions.adoc[Mistral AI function invocation docs].
+* Mistral AI: Refer to the xref:api/clients/functions/mistralai-chat-functions.adoc[Mistral AI function invocation docs].
