Add base_url to OpenAILanguageModel #51
Conversation
@aksg87 how does this look to you? 🤓

Hi @mariano, Thanks for this PR! Really appreciate your patience with the review process - things have been a bit hectic with the initial launch and the library suddenly getting a lot more usage than we expected. The base_url addition is exactly what we needed and your implementation is clean. Tests look good, everything passes, and it's ready to go. Looking forward to more contributions from you as you run into other issues or have ideas for improvements. Thanks again!

Thanks mate! I've been following your GitHub workflow commits for automating the review process, including the check for community reactions. Slick! I was thinking of pushing a PR to cache pip dependencies; on my workflows it shaves off a lot of CI time, particularly when you have a Python matrix of more than one version. If you agree, I can create an issue + PR for it, following this implementation.

Running live API tests... This will take a few minutes.

✅ Live API tests passed! All endpoints are working correctly.

Hi @mariano, Definitely create an issue for this and feel free to send that PR - because it involves infra files it might cause some failures but I can review it soon. Prioritizing things, but that sounds like a great idea on first pass :)
I notice that top_p is not defined or assigned in the constructor of OpenAILanguageModel. It can be passed to infer() through kwargs, but lx.extract() does not accept kwargs. This can cause errors when using some OpenAI-compatible servers (e.g., Xinference validates top_p in the request and fails).
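A minimal sketch of the kind of change the comment is asking for (this is an illustrative re-implementation, not langextract's actual code; the class and method names here are hypothetical): accept top_p in the constructor and only include it in the request payload when explicitly set, so strict OpenAI-compatible servers never receive an unintended value.

```python
class TopPParamSketch:
    """Hypothetical sketch of threading top_p through a constructor.

    Not langextract's real OpenAILanguageModel; names are illustrative.
    """

    def __init__(self, model_id, api_key=None, base_url=None, top_p=None):
        self.model_id = model_id
        self.top_p = top_p

    def _request_params(self, prompt):
        # Build the chat-completion payload. top_p is only added when
        # the caller set it, so servers that validate the field (like
        # Xinference, per the comment above) don't reject the request.
        params = {
            "model": self.model_id,
            "messages": [{"role": "user", "content": prompt}],
        }
        if self.top_p is not None:
            params["top_p"] = self.top_p
        return params
```

With this shape, `TopPParamSketch("m")._request_params("hi")` omits top_p entirely, while `TopPParamSketch("m", top_p=0.9)` includes it.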
* Add base_url to OpenAILanguageModel
* GitHub Action lint is outdated, so adapting
* Adding base_url to parameterized test
* Lint fixes to inference_test.py
Fixes #53
Addresses #80, #67, #54
Adds a `base_url` parameter to OpenAILanguageModel, enabling connection to any OpenAI-compatible endpoint. This change follows the OpenAI client API design and maintains full backward compatibility.
Usage Examples
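The original usage examples were not captured in this excerpt. As a stand-in, here is a hedged sketch of how a `base_url` parameter of this kind typically behaves (an illustrative mock, not the library's actual class; the class name, default URL constant, and local endpoint URL are assumptions):

```python
class BaseUrlConfigSketch:
    """Hypothetical sketch of base_url handling; not langextract's real code."""

    DEFAULT_BASE_URL = "https://api.openai.com/v1"

    def __init__(self, model_id, api_key=None, base_url=None):
        self.model_id = model_id
        self.api_key = api_key
        # Omitting base_url preserves backward compatibility: requests
        # still go to the official OpenAI endpoint by default.
        self.base_url = base_url or self.DEFAULT_BASE_URL


# Point the model at a local OpenAI-compatible server (URL illustrative):
local = BaseUrlConfigSketch(
    "gpt-4o-mini",
    api_key="unused-for-local",
    base_url="http://localhost:8000/v1",
)
```

The design choice mirrors the OpenAI Python client, where passing `base_url` to the client constructor redirects all requests while leaving the default behavior untouched.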
All tests succeed with: `poetry run pytest tests/`