
Commit 0a1c390

fix: broken alias prop (#23014)

1 parent: a949a73

File tree

1 file changed (+5, -5 lines)


content/manuals/ai/compose/models-and-compose.md

Lines changed: 5 additions & 5 deletions
@@ -3,7 +3,7 @@ title: Define AI Models in Docker Compose applications
 linkTitle: Use AI models in Compose
 description: Learn how to define and use AI models in Docker Compose applications using the models top-level element
 keywords: compose, docker compose, models, ai, machine learning, cloud providers, specification
-alias:
+aliases:
   - /compose/how-tos/model-runner/
   - /ai/compose/model-runner/
 weight: 10
@@ -68,14 +68,14 @@ models:
 ```
 
 Common configuration options include:
-- `model` (required): The OCI artifact identifier for the model. This is what Compose pulls and runs via the model runner. 
+- `model` (required): The OCI artifact identifier for the model. This is what Compose pulls and runs via the model runner.
 - `context_size`: Defines the maximum token context size for the model.
- 
+
 > [!NOTE]
 > Each model has its own maximum context size. When increasing the context length,
 > consider your hardware constraints. In general, try to keep context size
 > as small as feasible for your specific needs.
- 
+
 - `runtime_flags`: A list of raw command-line flags passed to the inference engine when the model is started.
   For example, if you use llama.cpp, you can pass any of [the available parameters](https://github.com/ggml-org/llama.cpp/blob/master/tools/server/README.md).
 - Platform-specific options may also be available via extension attributes `x-*`
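Taken together, the options touched by this hunk fit into a Compose file along these lines. This is a minimal sketch based only on the options named in the diff; the service name, image name, model key, and the `ai/smollm2` model reference are illustrative assumptions, not taken from the commit:

```yaml
# Hypothetical Compose file illustrating the documented model options.
services:
  chat:
    image: my-chat-app        # illustrative service/image name
    models:
      - llm                   # binds the model defined below to this service

models:
  llm:
    model: ai/smollm2         # OCI artifact identifier pulled by the model runner (illustrative)
    context_size: 4096        # maximum token context; keep as small as feasible
    runtime_flags:            # raw flags passed through to the inference engine
      - "--temp"
      - "0.7"
```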
@@ -172,7 +172,7 @@ Docker Model Runner will:
 >
 > This approach is deprecated. Use the [`models` top-level element](#basic-model-definition) instead.
 
-You can also use the `provider` service type, which allows you to declare platform capabilities required by your application. 
+You can also use the `provider` service type, which allows you to declare platform capabilities required by your application.
 For AI models, you can use the `model` type to declare model dependencies.
 
 To define a model provider:
0 commit comments
