From ad5f1e8a4212895195a0264695b0f83b04723381 Mon Sep 17 00:00:00 2001
From: =?UTF-8?q?Mohammad-Ali=20A=27r=C3=A2bi?=
Date: Tue, 22 Apr 2025 03:12:02 +0200
Subject: [PATCH] Add Docker Model Runner

---
 content/guides/genai-pdf-bot/containerize.md |  2 +-
 content/guides/genai-pdf-bot/develop.md      | 14 ++++++++++++++
 2 files changed, 15 insertions(+), 1 deletion(-)

diff --git a/content/guides/genai-pdf-bot/containerize.md b/content/guides/genai-pdf-bot/containerize.md
index eae8f42ae2b..738f2d09d63 100644
--- a/content/guides/genai-pdf-bot/containerize.md
+++ b/content/guides/genai-pdf-bot/containerize.md
@@ -12,7 +12,7 @@ aliases:
 
 > [!NOTE]
 >
-> GenAI applications can often benefit from GPU acceleration. Currently Docker Desktop supports GPU acceleration only on [Windows with the WSL2 backend](/manuals/desktop/features/gpu.md#using-nvidia-gpus-with-wsl2). Linux users can also access GPU acceleration using a native installation of the [Docker Engine](/manuals/engine/install/_index.md).
+> GenAI applications can often benefit from GPU acceleration. Currently, Docker Desktop supports GPU acceleration only on [Windows with the WSL2 backend](/manuals/desktop/features/gpu.md#using-nvidia-gpus-with-wsl2). Linux users can also access GPU acceleration using a native installation of the [Docker Engine](/manuals/engine/install/_index.md). On Mac, you can use [Docker Model Runner](https://docs.docker.com/desktop/features/model-runner/) to run models natively.
 
 - You have installed the latest version of [Docker Desktop](/get-started/get-docker.md) or, if you are a Linux user and are planning to use GPU acceleration, [Docker Engine](/manuals/engine/install/_index.md). Docker adds new features regularly and some parts of this guide may work only with the latest version of Docker Desktop.
 - You have a [git client](https://git-scm.com/downloads). The examples in this section use a command-line based git client, but you can use any client.
diff --git a/content/guides/genai-pdf-bot/develop.md b/content/guides/genai-pdf-bot/develop.md
index 7a0dcae854e..6a765d49951 100644
--- a/content/guides/genai-pdf-bot/develop.md
+++ b/content/guides/genai-pdf-bot/develop.md
@@ -201,6 +201,20 @@ To run Ollama outside of a container:
    $ ollama pull llama2
    ```
 
+{{< /tab >}}
+{{< tab name="Use Docker Model Runner" >}}
+
+Docker Model Runner provides an Ollama-compatible API, so the application can use it in place of Ollama.
+
+1. Make sure your operating system and Docker Desktop version support [Docker Model Runner](https://docs.docker.com/desktop/features/model-runner/).
+   Docker Model Runner was initially released for Apple silicon Macs in Docker Desktop 4.40.
+2. Download the model of your choice:
+   ```console
+   $ docker model pull ai/llama3.3
+   ```
+3. Update the `OLLAMA_BASE_URL` value in your `.env` file to
+   `http://model-runner.docker.internal`.
+
 {{< /tab >}}
 {{< tab name="Use OpenAI" >}}
 
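
As a quick smoke test of the Model Runner path added in the `develop.md` hunk above, commands along these lines should confirm that the model was pulled and can answer a prompt. This is a sketch that assumes Docker Desktop 4.40 or later with Docker Model Runner enabled and the `ai/llama3.3` model name used in the patch:

```console
# List locally available models; ai/llama3.3 should appear after the pull
$ docker model list

# Send a one-off prompt to verify the model loads and responds
$ docker model run ai/llama3.3 "Say hello in one sentence."
```

If the model responds here, pointing `OLLAMA_BASE_URL` at `http://model-runner.docker.internal` as described in step 3 should let the application reach it from inside its container.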