Upgrade vLLM version to v0.9.2 #1652
Conversation
Force-pushed from 78ba14a to a8c7838
Codecov Report: All modified and coverable lines are covered by tests ✅

@@            Coverage Diff             @@
##             main    #1652       +/-   ##
===========================================
+ Coverage   27.39%   53.92%   +26.52%
===========================================
  Files          56       80       +24
  Lines        6191     9964     +3773
===========================================
+ Hits         1696     5373     +3677
- Misses       4495     4591       +96
===========================================

Flags with carried forward coverage won't be shown.
This pull request has conflicts, please resolve those before we can evaluate the pull request.
Force-pushed from ef63f8a to c6ffd0a
Signed-off-by: Yikun Jiang <yikunkero@gmail.com>
Wait for upstream vLLM final release.
What this PR does / why we need it?
This patch upgrades the vLLM version to v0.9.2. To ease review, it does not remove the v0.9.1-compatible code.
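Keeping the v0.9.1-compatible code alongside v0.9.2 support typically means gating code paths on the installed vLLM version. The sketch below illustrates that pattern; the function names and the `VLLM_VERSION` constant are hypothetical, not taken from this PR (a real plugin would read `vllm.__version__`).

```python
# Hypothetical sketch of a version-gating compatibility shim: names are
# illustrative, not from the PR. The idea is that v0.9.1-compatible paths
# can coexist with v0.9.2 support until removed in a follow-up.

def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '0.9.2' into (0, 9, 2) for ordered comparison."""
    return tuple(int(part) for part in v.split("."))

# Assumption for this sketch; in practice this would come from vllm.__version__.
VLLM_VERSION = "0.9.2"

if parse_version(VLLM_VERSION) >= (0, 9, 2):
    def get_attn_impl() -> str:
        # New code path targeting the v0.9.2 interfaces.
        return "v0.9.2 path"
else:
    def get_attn_impl() -> str:
        # Retained v0.9.1-compatible path, kept in place to ease review.
        return "v0.9.1 compatible path"
```

Selecting the implementation once at import time, rather than checking the version on every call, keeps the hot path free of branching.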
Does this PR introduce any user-facing change?
No
How was this patch tested?