Langchain with Local LLMs as API #10363
RamishSiddiqui started this conversation in General
Replies: 2 comments 3 replies
-
You can set an `openai_proxy` or configure the `base_url` when initializing.
-
You can use
-
Has anybody tried working with LangChain chains that call an LLM deployed locally on their own machine? Basically, LangChain would make an API call to the locally deployed LLM just as it makes API calls to OpenAI's ChatGPT, except that the API endpoint is local.
Any help in this regard would be appreciated: what framework is used to deploy LLMs behind an API, and how would LangChain call it?
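The setup being asked about is just an HTTP POST to an OpenAI-style `/v1/chat/completions` route served locally; frameworks such as vLLM, llama.cpp's server, or LM Studio expose one. A minimal stdlib sketch of the kind of request LangChain would send (the URL, port, and model name are hypothetical):

```python
# Sketch: building the OpenAI-style chat request that a LangChain client
# would POST to a locally served LLM. Assumes a local server listening at
# http://localhost:8000/v1 -- the port and model name are hypothetical.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # hypothetical local endpoint


def build_chat_request(prompt: str, model: str = "local-model") -> urllib.request.Request:
    """Build the POST request for an OpenAI-compatible chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer not-needed",  # local servers ignore the key
        },
    )


# With a server actually running, you would send it like this:
# with urllib.request.urlopen(build_chat_request("Hello")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the wire format matches OpenAI's, LangChain's OpenAI integrations work against such a server unchanged, only the base URL differs.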