Optionally fetch list of models from API #394
keegancsmith started this conversation in Ideas
Replies: 1 comment
@keegancsmith Here is how I get the list of models from my self-hosted Ollama service:

```elisp
(setq gjg/ollama-api-endpoint "http://1.2.3.4:11434")

(require 's)  ;; for `s-split'

(defun gjg/get-ollama-models ()
  "Return a list of Ollama models from the API, or nil if the API is not available."
  (when-let*
      ((url (url-generic-parse-url gjg/ollama-api-endpoint))
       (host (url-host url))
       (port (url-port url))
       ;; Use a short timeout with curl to check that the API port is open.
       (maybe-ollama-models
        (s-split "\n"
                 (shell-command-to-string
                  (concat "curl -s --connect-timeout 0.5 '"
                          gjg/ollama-api-endpoint
                          "/api/tags' | jq -r '.models[].name'"))))
       ;; Drop the empty string left over from the trailing newline; if
       ;; nothing remains, `when-let*' short-circuits and we return nil.
       (my-ollama-models (seq-filter (lambda (s) (not (string= s "")))
                                     maybe-ollama-models)))
    (gptel-make-ollama "Ollama"
      :host (concat host ":" (number-to-string port))
      :models my-ollama-models
      :stream nil)
    my-ollama-models))

;; Set globally, as the model list is relevant to more than gptel.
(setq ollama-models (gjg/get-ollama-models))

;; If the API port was not reachable, remove the backend.
(unless ollama-models
  (setq gptel--known-backends
        (assoc-delete-all "Ollama" gptel--known-backends #'equal)))
```
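As a side note, the same fetch can be done without the curl, jq, and s.el dependencies using Emacs' built-in url and JSON libraries. The sketch below is an untested variant under those assumptions (the function name is hypothetical; it assumes Emacs 27+ for `json-parse-buffer` and reuses `gjg/ollama-api-endpoint` from above):

```elisp
;; Hypothetical dependency-free variant: fetch /api/tags with the built-in
;; url library and parse the JSON in Emacs instead of shelling out.
(require 'url)
(require 'url-http)

(defun gjg/get-ollama-models-builtin ()
  "Return a list of Ollama model names, or nil if the API is unreachable."
  (ignore-errors
    (with-current-buffer
        ;; SILENT and INHIBIT-COOKIES t; 0.5s timeout, like the curl version.
        (url-retrieve-synchronously
         (concat gjg/ollama-api-endpoint "/api/tags") t t 0.5)
      (goto-char url-http-end-of-headers)
      ;; /api/tags returns {"models": [{"name": ...}, ...]}.
      (let ((data (json-parse-buffer :object-type 'alist)))
        (mapcar (lambda (m) (alist-get 'name m))
                (alist-get 'models data))))))
```

The trade-off is portability (no external binaries) versus the simplicity of the one-line curl pipeline.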
A lot of API endpoints support fetching the list of models they serve. Right now gptel hardcodes the list of supported models, or we have to configure it via :models when we create the backend. It would be cool if this could be made dynamic, e.g. maybe the models slot could also be a function?
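A hypothetical sketch of what the proposal could look like from the user's side; note that :models accepting a function is exactly the feature being requested here and does not exist in gptel today:

```elisp
;; Hypothetical: if gptel resolved a function in the :models slot at the
;; point of use, the backend could stay in sync with the server.
(gptel-make-ollama "Ollama"
  :host "localhost:11434"
  ;; Evaluated lazily, e.g. hitting /api/tags as in the snippet above.
  :models (lambda () (gjg/get-ollama-models))
  :stream t)
```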