Commit 72ad273

Remove torch_xla.tpu.version() from pallas.py. (#21065)
Signed-off-by: Qiliang Cui <derrhein@gmail.com>
1 parent 01513a3 commit 72ad273

File tree

1 file changed: 0 additions, 4 deletions


vllm/v1/attention/backends/pallas.py

Lines changed: 0 additions & 4 deletions
@@ -167,10 +167,6 @@ def __init__(
                 "are not implemented for "
                 "PallasAttentionBackendImpl")
 
-        tpu_version = torch_xla.tpu.version()
-        if tpu_version < 4:
-            raise NotImplementedError("TPU version must be 4 or higher.")
-
     def forward(
         self,
         layer: AttentionLayer,
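
For reference, the removed lines were a TPU-generation guard in PallasAttentionBackendImpl.__init__. Below is a minimal standalone sketch of that guard, reconstructed from the diff above; the helper name and the bare torch_xla import are assumptions for illustration (the original file relies on its own torch_xla imports), not part of the commit.

    import torch_xla

    def _check_tpu_generation() -> None:
        # Hypothetical helper replicating the guard this commit removed:
        # torch_xla.tpu.version() reports the TPU generation (e.g. 4 for v4),
        # and the Pallas backend previously rejected anything older than v4.
        tpu_version = torch_xla.tpu.version()
        if tpu_version < 4:
            raise NotImplementedError("TPU version must be 4 or higher.")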

0 commit comments
