Using LangChain's ChatOpenAI to Access vLLM's OpenAI-Compatible Completion API with extra_body #5133
xiaojinwhu announced in Q&A
Replies: 1 comment
-
Try this
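Depending on your langchain-openai version, you should be able to pass extra_body either directly to ChatOpenAI or through model_kwargs. A minimal sketch, assuming a local vLLM server; the base URL, model name, and sampling parameters are placeholders:

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # placeholder: whatever model vLLM is serving
    openai_api_base="http://localhost:8000/v1",   # placeholder: your vLLM server URL
    openai_api_key="EMPTY",                       # vLLM ignores the key unless one was configured
    # Recent langchain-openai releases accept extra_body directly and forward it
    # into the request body, which is where vLLM-specific sampling params belong.
    extra_body={"top_k": 50, "repetition_penalty": 1.05},
)

# On older versions without the extra_body field, routing it through
# model_kwargs should behave the same way:
# llm = ChatOpenAI(..., model_kwargs={"extra_body": {"top_k": 50}})

print(llm.invoke("Hello!").content)
```

Either way, the dict is merged into the JSON body of the /v1/chat/completions request, so anything vLLM accepts as an extra parameter can go there.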
-
Hi everyone,
I'm working on integrating vLLM's OpenAI-compatible completion API into my project through LangChain. I've been able to use the OpenAI Python library to send requests with the required extra_body parameter, but I'm running into trouble doing the same with LangChain's ChatOpenAI class.
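For reference, this is roughly the pattern that works for me with the plain OpenAI client; the base URL, model name, and sampling parameters below are placeholders for my actual setup:

```python
from openai import OpenAI

# Point the OpenAI client at the vLLM OpenAI-compatible server.
# vLLM only checks the API key if the server was started with one.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3-8B-Instruct",
    messages=[{"role": "user", "content": "Hello!"}],
    # vLLM-specific sampling parameters are passed via extra_body.
    extra_body={"top_k": 50, "repetition_penalty": 1.05},
)
print(response.choices[0].message.content)
```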
My Question:
Has anyone successfully used LangChain's ChatOpenAI to call vLLM's OpenAI-compatible completion API with the extra_body parameter? If so, could you please share an example or point me in the right direction?
Additional Notes:
I'm using the latest versions of LangChain and vLLM.
I understand that ChatOpenAI is primarily designed for OpenAI's chat models, but I'm hoping there's a way to adapt it for this use case.
Any help or insights would be greatly appreciated!
Thanks in advance,