@@ -20,11 +20,11 @@ Once the model is downloaded, and your Ollama instance is running and accessible
Navigate to the Admin Portal in AI Server and select the **AI Providers** menu item on the left sidebar.

- ![AI Providers](/img/pages/ai-server/admin-dashboard.png)
+ ![AI Providers](/img/pages/ai-server/admin-dashboard.webp)

Click on the **New Provider** button at the top of the grid.

- ![New Provider](/img/pages/ai-server/admin-dashboard-providers.png)
+ ![New Provider](/img/pages/ai-server/admin-dashboard-providers.webp)

Select Ollama as the Provider Type at the top of the form, and fill in the required fields:
@@ -33,13 +33,13 @@ Select Ollama as the Provider Type at the top of the form, and fill in the requi
- **API Key**: Optional API key to authenticate with your Ollama instance.
- **Priority**: The priority of the provider, used to determine the order of provider selection when multiple providers offer the same model.

- ![Ollama Provider](/img/pages/ai-server/admin-dashboard-ollama-provider.png)
+ ![Ollama Provider](/img/pages/ai-server/admin-dashboard-ollama-provider.webp)

Once the URL (and optional API Key) is set, requests will be made to your Ollama instance to list available models.
By default, it will look for a locally running Ollama instance on port 11434, but you can change the URL to point to your Ollama instance.
These will then be displayed as options to enable for the provider you are configuring.
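If no models appear in the form, you can confirm the instance is reachable from the command line; a minimal check against Ollama's model-listing endpoint, assuming the default local port described above:

```shell
# List the models your Ollama instance exposes.
# Assumes the default localhost:11434 endpoint; adjust the
# host and port if your instance runs elsewhere.
curl -s http://localhost:11434/api/tags
```

For a remote instance behind an API key, add an `Authorization: Bearer <key>` header to the same request.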

- ![Ollama Models](/img/pages/ai-server/ollama-models.png)
+ ![Ollama Models](/img/pages/ai-server/ollama-models.webp)

Select the models you want to enable for this provider, and click **Save** to save the provider configuration.