Local model
#157
Replies: 1 comment
- Hi @Nihir2904, currently this project only targets the OpenAI API (https://api.openai.com/v1/chat/completions) and doesn't work with local models.
- Heyy, what changes should I make to run a local model on it? I have a llama-2b model.
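
Not something the project supports out of the box, but as a rough pointer: local runtimes such as Ollama or llama.cpp's server expose an OpenAI-compatible `/v1/chat/completions` route, so the request shape is the same as what the project already sends to api.openai.com. Below is a minimal sketch of such a request against a local server; the base URL, placeholder API key, and model name are assumptions about a local setup, not values taken from this project.

```python
# Minimal sketch: an OpenAI-style chat/completions request sent to a local
# OpenAI-compatible server (e.g. Ollama or llama.cpp's server) instead of
# https://api.openai.com. The base URL, key placeholder, and model name are
# assumptions about the local setup, not part of this project.
import requests

BASE_URL = "http://localhost:11434/v1"  # assumed local endpoint (Ollama's default port)
MODEL = "llama2"                         # whatever name the local server registers

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": "Bearer not-needed-for-local"},  # most local servers ignore the key
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Hello from a local model"}],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```

If the project ever exposes its API base URL as a setting, pointing it at a local endpoint like this would be the main change needed; until then, the snippet only illustrates the wire format involved.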