MPCD using MPI
#1887
1 comment · 14 replies
-
As the error message states, the snapshot data is present only on rank 0. The MPI tutorial shows how you can access global snapshot data in an …
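In HOOMD-blue, a global snapshot's per-particle arrays are populated only on rank 0, so any code that reads or edits them must sit behind a rank check. Below is a minimal, MPI-free sketch of that rank-0 guard pattern; the `rank` argument and the dict standing in for the snapshot are illustrative stand-ins, not the actual hoomd API.

```python
def edit_global_snapshot(rank, snapshot):
    """Apply an edit only where the global data is valid (rank 0).

    `snapshot` is a plain dict standing in for a global snapshot whose
    arrays are populated only on rank 0; other ranks see None.
    """
    if rank == 0:
        # Global snapshot arrays are valid only here; e.g. double all positions.
        snapshot["position"] = [(2 * x, 2 * y, 2 * z)
                                for (x, y, z) in snapshot["position"]]
    # Off rank 0 this is a no-op; the modified snapshot is later
    # redistributed to all ranks by the simulation itself.
    return snapshot


# Usage: rank 0 holds the data, every other rank holds None.
root_view = {"position": [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]}
other_view = {"position": None}

edit_global_snapshot(0, root_view)   # edits the data on rank 0
edit_global_snapshot(1, other_view)  # does nothing on other ranks
```

In a real hoomd script the same guard would test the communicator's rank (e.g. `sim.device.communicator.rank == 0`) around the snapshot access, as the MPI tutorial referenced above demonstrates.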
-
I am trying to run the hoomd-mpcd routine using multiple CPUs or GPUs.
I managed to run MPCD simulations on a single rank, both on CPU and on GPU,
but the same script does not run on multiple ranks.
As far as I understand, I can simply run the same script written for a single core, can't I?
I would be very grateful if you could help me.
Here is the code (test_hoomd_mpcd.py) that I am using:
On the HPC I am using, I run
srun --exact -n 4 --gres=gpu:4 python test_hoomd_mpcd.py
And the error messages I got:
Thank you very much!!!