Replies: 1 comment
-
Looks like the `--alias ...` parameter works for the server binary: https://github.com/ggerganov/llama.cpp/blob/master/examples/server/README.md
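As a sketch, a launch command might look like this (the binary name `llama-server` and the model path are illustrative; the assumption, per the linked README, is that `--alias` sets the model name the API reports):

```shell
# Hypothetical invocation: serve the model under a friendly alias
# instead of exposing its filesystem path as the id.
./llama-server -m /home/ubuntuai/models/WizardLM-2-8x22B-Q4_K_M.gguf \
  --alias gpt-3.5-turbo \
  --host 0.0.0.0 --port 5000
```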
-
Hi!
Llama.cpp HTTP Server seems to output the full path of the model
I have tried to override the model name with
`--override-kv general.name=str:gpt-3.5-turbo`
but it doesn't work: the API still outputs the full path of the file, not `general.name`. One solution might be to change this behavior and make the llama.cpp server API output the `general.name` KV value as the id instead of the model path.
Another solution would be to add a parameter such as `--override-model-name` to change this.
For example, currently this:
```shell
curl http://192.168.1.151:5000/v1/models
```
will output:
```json
{
  "object": "list",
  "data": [
    {
      "id": "/home/ubuntuai/models/WizardLM-2-8x22B-Q4_K_M.gguf",
      "object": "model",
      "created": 1721242854,
      "owned_by": "llamacpp",
      "meta": {
        "vocab_type": 1,
        "n_vocab": 32000,
        "n_ctx_train": 65536,
        "n_embd": 6144,
        "n_params": 140620634112,
        "size": 85591511040
      }
    }
  ]
}
```
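To illustrate why the path as id is awkward, here is a minimal Python sketch of what an OpenAI-compatible client sees when it parses that response (using the sample body above, embedded as a string rather than fetched from a live server):

```python
import json

# Sample response body from GET /v1/models, copied from the output above.
response_body = (
    '{"object":"list","data":[{"id":"/home/ubuntuai/models/'
    'WizardLM-2-8x22B-Q4_K_M.gguf","object":"model","created":1721242854,'
    '"owned_by":"llamacpp","meta":{"vocab_type":1,"n_vocab":32000,'
    '"n_ctx_train":65536,"n_embd":6144,"n_params":140620634112,'
    '"size":85591511040}}]}'
)

models = json.loads(response_body)

# OpenAI-compatible clients typically use the "id" field to select a model,
# so exposing a full filesystem path here leaks server-side layout and
# makes the id awkward to configure in client tooling.
model_id = models["data"][0]["id"]
print(model_id)
```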
I think it would be better to output `general.name` by default, or to provide a parameter to override the name.