"RuntimeError: Expected all tensors to be on the same device" faced when running tutorial #2903

Closed Answered by Balandat
gkstmdwn asked this question in Q&A

Hi @gkstmdwn, thanks for flagging - yes, there is an issue here. For one, we should put the test problem on the correct device:

neg_hartmann6 = Hartmann(negate=True).to(device=device)
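The underlying rule is a plain PyTorch one: every tensor that interacts in an op must live on the same device, and tensors created without an explicit `device` argument default to the CPU. A minimal torch-only sketch of the pattern (the tensor names here are illustrative, not from the tutorial):

```python
import torch

# Pick the GPU when available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Create tensors directly on `device`; mixing a CPU tensor with a CUDA
# tensor in one op is exactly what raises the RuntimeError above.
train_x = torch.rand(10, 6, device=device)

# Alternatively, move an existing CPU tensor (or module) with `.to(...)`,
# which is what the `Hartmann(...).to(device=device)` fix does.
bounds = torch.stack([torch.zeros(6), torch.ones(6)]).to(device)

assert train_x.device == bounds.device
```

The same `.to(device=...)` call works on BoTorch test functions because they are `torch.nn.Module`s, so the fix above moves the problem's internal buffers along with it.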

I also found another issue where the random points were not generated on the GPU:

def update_random_observations(best_random):
    """Simulates a random policy by taking the current list of best values
    observed, drawing a new random point, observing its value, and updating
    the list.
    """
    rand_x = torch.rand(BATCH_SIZE, 6, device=device)  # <-- note the `device` here
    next_random_best = weighted_obj(rand_x).max().item()
    best_random.append(max(best_random[-1], next_random_best))
    return best_random
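To see the corrected function run end to end without the rest of the tutorial, here is a self-contained sketch; `weighted_obj` and `BATCH_SIZE` come from the tutorial, so the stand-in objective below is purely illustrative:

```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
BATCH_SIZE = 3

def weighted_obj(x):
    # Stand-in for the tutorial's objective; any tensor-in, tensor-out
    # function that accepts a (BATCH_SIZE, 6) batch works here.
    return -(x - 0.5).pow(2).sum(dim=-1)

def update_random_observations(best_random):
    """Draw a random batch on `device`, evaluate it, and append the
    running best value to the list."""
    rand_x = torch.rand(BATCH_SIZE, 6, device=device)  # created on `device`
    next_random_best = weighted_obj(rand_x).max().item()
    best_random.append(max(best_random[-1], next_random_best))
    return best_random

# Seed the running-best list and take one step of the random policy.
best_random = update_random_observations([float("-inf")])
```

Because `rand_x` is created directly on `device`, the objective evaluation never mixes CPU and CUDA tensors, which is what triggered the original error.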

Balandat (Collaborator) · Jul 2, 2025

Answer selected by gkstmdwn