Start using py3.12 for TPU. #21000
Conversation
👋 Hi! Thank you for contributing to the vLLM project. 💬 Join our developer Slack at https://slack.vllm.ai to discuss your PR in #pr-reviews, coordinate on features in #feat- channels, or join special interest groups in #sig- channels. Just a reminder: PRs do not trigger a full CI run by default; only a limited set of checks runs automatically. Once the PR is approved and ready to go, your PR reviewer(s) can run CI to test the changes comprehensively before merging.
This pull request has merge conflicts that must be resolved before it can be merged.
Code Review
This pull request updates the TPU Dockerfile and requirements to use Python 3.12. There is a potential issue with mismatched nightly dates between the base image and the installed packages. Please ensure the dates are aligned.
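The review concern above is that every nightly pin (torch, torchvision, torch_xla) must carry the same `.devYYYYMMDD` date. A small sketch of how such a consistency check could be automated — `nightly_dates` is a hypothetical helper, not part of the PR or of vLLM:

```python
import re

def nightly_dates(requirements_text):
    """Return the set of YYYYMMDD dev dates found in nightly pins.

    A consistent requirements file should yield exactly one date, so the
    torch, torchvision, and torch_xla wheels stay in lockstep.
    """
    return set(re.findall(r"\.dev(\d{8})", requirements_text))

# Abbreviated excerpt modeled on requirements/tpu.txt in this PR.
sample = """\
torch==2.9.0.dev20250714
torchvision==0.24.0.dev20250714
torch_xla[tpu,pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.9.0.dev20250714-cp312-cp312-linux_x86_64.whl ; python_version == "3.12"
"""

dates = nightly_dates(sample)
assert len(dates) == 1, f"mismatched nightly dates: {dates}"
print(sorted(dates))
```

The same check would also need to cover the base-image tag in the TPU Dockerfile, which this snippet does not parse.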
requirements/tpu.txt
Outdated
torch==2.9.0.dev20250714
torchvision==0.24.0.dev20250714
torch_xla[tpu, pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.9.0.dev20250714-cp39-cp39-linux_x86_64.whl ; python_version == "3.9"
torch_xla[tpu, pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.9.0.dev20250714-cp310-cp310-linux_x86_64.whl ; python_version == "3.10"
torch_xla[tpu, pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.9.0.dev20250714-cp311-cp311-linux_x86_64.whl ; python_version == "3.11"
torch_xla[tpu, pallas] @ https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/torch_xla-2.9.0.dev20250714-cp312-cp312-linux_x86_64.whl ; python_version == "3.12"
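The `python_version == "X.Y"` environment markers above let pip select exactly one torch_xla wheel per interpreter. A rough sketch of that selection logic under the wheel URL scheme used in this file — `wheel_for` is illustrative, not how pip is implemented:

```python
# Illustrative only: mimics how the python_version markers in
# requirements/tpu.txt map an interpreter version to one torch_xla wheel.
WHEEL_TEMPLATE = (
    "https://storage.googleapis.com/pytorch-xla-releases/wheels/tpuvm/"
    "torch_xla-2.9.0.dev20250714-cp{v}-cp{v}-linux_x86_64.whl"
)

def wheel_for(major, minor):
    """Return the wheel URL whose marker matches Python major.minor."""
    if (major, minor) not in {(3, 9), (3, 10), (3, 11), (3, 12)}:
        # No line in the requirements file matches, so pip would fail to
        # resolve torch_xla for this interpreter.
        raise ValueError(f"no torch_xla wheel pinned for Python {major}.{minor}")
    return WHEEL_TEMPLATE.format(v=f"{major}{minor}")

print(wheel_for(3, 12))
```

With this PR the TPU Dockerfile uses Python 3.12, so the cp312 wheel is the one actually installed in the image.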
Force-pushed from 7435fd5 to d410d8e
Signed-off-by: Xiongfei Wei <isaacwxf23@gmail.com>
Force-pushed from 6d3fdac to 2ae4713
Essential Elements of an Effective PR Description Checklist
(Optional) Documentation update, such as updating supported_models.md and examples for a new model.

Purpose
PyTorch/XLA nightlies (starting 07/14/2025) have dropped support for Python 3.10, the Python version vLLM on TPU currently uses, and the next JAX release (0.7) will also drop support for Python <3.11. This PR updates the Python version used for TPU to 3.12, consistent with the Python version used for GPU.
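The constraints in the paragraph above can be encoded as a tiny version predicate. This is a hypothetical illustration of the PR's reasoning, not vLLM code:

```python
def tpu_stack_supports(python_version):
    """Return True if (major, minor) meets the dependency floors.

    Per the PR description: PyTorch/XLA nightlies from 07/14/2025 dropped
    Python 3.10, and JAX 0.7 will require Python >= 3.11, so any version
    below 3.11 can no longer run the TPU stack.
    """
    return python_version >= (3, 11)

# 3.12 satisfies both floors and matches the GPU builds; 3.10 does not.
print(tpu_stack_supports((3, 10)), tpu_stack_supports((3, 12)))
```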
Test Plan
CI
Test Result
TPU CI passed.
(Optional) Documentation Update