Labels for the pytorch/xla repository:

documentation
duplicate: This issue or pull request already exists
dynamism: Dynamic shape features
eager: PyTorch/XLA eager mode
enhancement: New feature or request
flaky: Issues due to flaky tests
functionalization-disabled: Issues specific to when functionalization is disabled
good first issue: Good for newcomers
gradient_checkpointing
high priority: Issues the team would like to fix quickly
install: PyTorch/XLA installation-related issues
lowering: ATen operation lowering
multiprocessing
needs reproduction
nostale: Do not consider for staleness
pytorch breaking: Upstream PyTorch breakage w.r.t. PyTorch/XLA
pytorch bug: This bug (likely) requires changes in PyTorch to address
pytorch divergence: XLA behavior doesn't match the PyTorch eager frontend
quantization