
Commit 2b8c992

Add docs for Ollama's template request parameter
1 parent 15628a3 commit 2b8c992

File tree

1 file changed: +2 −1 lines changed


spring-ai-docs/src/main/antora/modules/ROOT/pages/api/chat/ollama-chat.adoc

Lines changed: 2 additions & 1 deletion
@@ -64,7 +64,8 @@ Here are the advanced request parameter for the Ollama chat client:
 | spring.ai.ollama.chat.enabled | Enable Ollama chat client. | true
 | spring.ai.ollama.chat.options.model | The name of the https://github.com/ollama/ollama?tab=readme-ov-file#model-library[supported models] to use. | mistral
 | spring.ai.ollama.chat.options.format | The format to return a response in. Currently the only accepted value is `json` | -
-| spring.ai.ollama.chat.options.keep_alive | controls how long the model will stay loaded into memory following the request | 5m
+| spring.ai.ollama.chat.options.keep_alive | Controls how long the model will stay loaded into memory following the request | 5m
+| spring.ai.ollama.chat.options.template | The prompt template to use (overrides what is defined in the Modelfile) | -
 |====

 The `options` properties are based on the link:https://github.com/jmorganca/ollama/blob/main/docs/modelfile.md#valid-parameters-and-values[Ollama Valid Parameters and Values] and link:https://github.com/jmorganca/ollama/blob/main/api/types.go[Ollama Types]. The default values are based on: link:https://github.com/ollama/ollama/blob/b538dc3858014f94b099730a592751a5454cab0a/api/types.go#L364[Ollama type defaults].
