LocalAI: OpenAI API compatible based on llama.cpp and ggml #1201
mudler started this conversation in Show and tell
-
Hi @mudler, I was looking for an efficient API to connect to llama.cpp. Does your API keep llama.cpp just as responsive? Thanks, and I love the work.
-
Hi 👋
First of all, as always, a big thanks to @ggerganov for the impressive work - I will never stop saying it.
I've created https://github.com/go-skynet/LocalAI and thought I'd share it here. It's a self-hosted, community-driven, simple local OpenAI-compatible API written in Go, and it can be used as a drop-in replacement for OpenAI. The goal is to use code bindings to expose a generic API that runs ggml-supported models efficiently (including GPT4All and StableLM) under the same API umbrella, without friction for the user. Since there are many llama.cpp forks and derived codebases, I sensed the need to bring them together in a single, convenient place.
It is already having a big impact on projects that have so far relied on the OpenAI API, for instance Kubernetes cluster analysis with k8sgpt: https://medium.com/@tyler_97636/k8sgpt-localai-unlock-kubernetes-superpowers-for-free-584790de9b65
Here is the Github repo: https://github.com/go-skynet/LocalAI
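The "drop-in replacement" claim above means an existing OpenAI client only needs its base URL pointed at the LocalAI server. Here is a minimal sketch using only the Python standard library; the base URL `http://localhost:8080/v1` and the model name `ggml-gpt4all-j` are illustrative assumptions, not details from this post, so substitute the address and model of your own deployment.

```python
import json
import urllib.request


def build_payload(prompt, model="ggml-gpt4all-j"):
    """Build an OpenAI-style chat completion request body.

    The model name is an assumed example; LocalAI serves whatever
    ggml models you have configured.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def chat_completion(prompt, base_url="http://localhost:8080/v1"):
    """POST the request to a LocalAI server's OpenAI-compatible endpoint.

    base_url is an assumption (LocalAI's default local port); the path
    mirrors OpenAI's /v1/chat/completions.
    """
    body = json.dumps(build_payload(prompt)).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Because the request and response shapes mirror OpenAI's, tools built against the OpenAI API (like the k8sgpt integration linked above) can switch backends by changing only the endpoint URL.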