-
Agree!! It would be fantastic for llama.cpp to have an OpenAI-compatible API.
-
Have you tried api_like_OAI.py?
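For anyone else landing on this thread, here is a minimal sketch of calling that shim once it is running in front of the llama.cpp server. The port (8081) and exact request fields are assumptions based on common defaults, so check the script's `--help` and the server README for your version:

```python
# Sketch: call the OpenAI-style /v1/chat/completions endpoint exposed by
# api_like_OAI.py. The port and payload fields are assumptions -- verify
# against the script's flags and your llama.cpp version.
import requests

resp = requests.post(
    "http://localhost:8081/v1/chat/completions",
    json={
        "model": "local",  # typically ignored; the server uses whatever model it loaded
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
        "temperature": 0.7,
    },
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```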
-
There is server API documentation in the llama.cpp repo. It's not comprehensive and can get out of date, but it's enough to get you started.
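The native server API is not OpenAI-shaped. As a rough illustration, a call to the documented /completion endpoint looks something like this (field names may differ between llama.cpp versions, so treat it as a sketch):

```python
# Sketch of the llama.cpp server's native completion endpoint (not OpenAI-compatible).
# "prompt", "n_predict", and the "content" response field follow the server README
# at the time of writing; check your version's docs before relying on them.
import requests

resp = requests.post(
    "http://localhost:8080/completion",
    json={
        "prompt": "Building a website can be done in 10 simple steps:",
        "n_predict": 128,
        "temperature": 0.8,
    },
)
resp.raise_for_status()
print(resp.json()["content"])
```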
-
@shibe2 thanks for those links. Was not aware of either of them.
-
I know that llama-cpp-python has OpenAI compatibility (https://abetlen.github.io/llama-cpp-python/), but it seems to me this functionality would be better placed in llama.cpp's server.
The server currently has an API, but I'm not aware of any documentation for it, or of how compatible it is with OpenAI's.
I'm using AutoGen with text-generation-webui and it works. I'd rather keep things simpler and just use the llama.cpp server.
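If it helps anyone trying the same setup, here is a sketch of pointing AutoGen at a local OpenAI-compatible endpoint (such as api_like_OAI.py in front of the llama.cpp server). The endpoint URL is an assumption, and the config key names vary between AutoGen releases (older versions use `api_base`, newer ones `base_url`):

```python
# Hypothetical AutoGen config targeting a local OpenAI-compatible endpoint.
# Adjust the URL and key names to your AutoGen version and local server.
import autogen

config_list = [
    {
        "model": "local-llama",            # label only; the server decides the actual model
        "api_base": "http://localhost:8081/v1",
        "api_key": "sk-no-key-required",   # local servers usually ignore the key
    }
]

assistant = autogen.AssistantAgent("assistant", llm_config={"config_list": config_list})
user = autogen.UserProxyAgent("user", human_input_mode="NEVER", code_execution_config=False)
user.initiate_chat(assistant, message="Write a haiku about local LLMs.")
```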