NOTE: The `spring-ai-ollama` dependency also provides access to the `OllamaEmbeddingClient`.
For more information about the `OllamaEmbeddingClient`, refer to the link:../embeddings/ollama-embeddings.html[Ollama Embedding Client] section.

Next, create an `OllamaChatClient` instance and use it to send text generation requests:
[source,java]
----
var ollamaApi = new OllamaApi();

var chatClient = new OllamaChatClient(ollamaApi)
    .withDefaultOptions(OllamaOptions.create()
        .withModel(OllamaOptions.DEFAULT_MODEL)
        .withTemperature(0.9f));

ChatResponse response = chatClient.call(
    new Prompt("Generate the names of 5 famous pirates."));

// Or with streaming responses
Flux<ChatResponse> streamingResponse = chatClient.stream(
    new Prompt("Generate the names of 5 famous pirates."));
----
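
As a usage sketch (assuming the Reactor `Flux` API implied by the return type, and the `ChatResponse#getResult()` accessor chain), the streaming variant can be consumed by subscribing to the returned `Flux`:

[source,java]
----
// Print each streamed chunk's content as it arrives; blockLast() keeps
// this demo alive until the stream completes.
streamingResponse
    .map(chatResponse -> chatResponse.getResult().getOutput().getContent())
    .doOnNext(System.out::print)
    .blockLast();
----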

The `OllamaOptions` provides the configuration information for all chat requests.

=== Chat Options

The https://github.com/spring-projects/spring-ai/blob/main/models/spring-ai-ollama/src/main/java/org/springframework/ai/ollama/api/OllamaOptions.java[OllamaOptions.java] class provides the configuration information for the chat requests, such as the model to use, the temperature, the frequency penalty, etc.
The default options can be configured using the `spring.ai.ollama.chat.options` properties as well.
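
For example, a minimal `application.properties` sketch (the property names assume the defaults used by the Ollama starter; the values are illustrative only):

[source,properties]
----
# Illustrative values: a local Ollama instance with default chat options.
spring.ai.ollama.base-url=http://localhost:11434
spring.ai.ollama.chat.options.model=mistral
spring.ai.ollama.chat.options.temperature=0.7
----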

At start-time, use `OllamaChatClient#withDefaultOptions()` to set the default options applicable to all chat completion requests.
At run-time, you can override the default options with an `OllamaOptions` instance in the request `Prompt`.

For example, to override the default model name and temperature for a specific request:

[source,java]
----
ChatResponse response = chatClient.call(
    new Prompt(
        "Generate the names of 5 famous pirates.",
        OllamaOptions.create()
            .withModel("llama2")
            .withTemperature(0.4f)
    ));
----

You can use as prompt options any instance that implements the portable `ChatOptions` interface.
For example, you can use the `ChatOptionsBuilder` to create portable prompt options.
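
A minimal sketch, assuming the `ChatOptionsBuilder` fluent API from Spring AI core (the option values are illustrative):

[source,java]
----
// Portable options are not Ollama-specific and work with any ChatClient.
ChatOptions portableOptions = ChatOptionsBuilder.builder()
    .withTemperature(0.4f)
    .withTopK(40)
    .withTopP(0.8f)
    .build();

ChatResponse response = chatClient.call(
    new Prompt("Generate the names of 5 famous pirates.", portableOptions));
----
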
== Auto-configuration
Spring AI provides Spring Boot auto-configuration for the Ollama Chat Client.
To enable it, add the following dependency to your project's Maven `pom.xml` file:
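
The coordinates below are a sketch, assuming the `spring-ai-ollama-spring-boot-starter` artifact published under the `org.springframework.ai` group:

[source,xml]
----
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-ollama-spring-boot-starter</artifactId>
</dependency>
----

or to your Gradle `build.gradle` build file:

[source,groovy]
----
dependencies {
    implementation 'org.springframework.ai:spring-ai-ollama-spring-boot-starter'
}
----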
NOTE: Refer to the xref:getting-started.adoc#_dependency_management[Dependency Management] section to add Milestone and/or Snapshot Repositories to your build file.

=== Sample Code

This will create a `ChatClient` implementation that you can inject into your classes.
Here is an example of a simple `@RestController` class that uses the `ChatClient` implementation.

[source,java]
----
@RestController
public class ChatController {

    private final ChatClient chatClient;

    @Autowired
    public ChatController(ChatClient chatClient) {
        this.chatClient = chatClient;
    }

    @GetMapping("/ai/generate")
    public Map<String, String> generate(@RequestParam(value = "message", defaultValue = "Tell me a joke") String message) {
        // Delegate the prompt to the injected chat client and return the generation.
        return Map.of("generation", chatClient.call(message));
    }
}
----
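
Assuming the application starts on the default port `8080`, the endpoint can then be exercised with, for example:

[source,shell]
----
curl "http://localhost:8080/ai/generate?message=Tell%20me%20a%20joke"
----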