Call MPI_AllReduce instead of C-wrapper for the 1D-recvbuf #92

Merged
gxyd merged 2 commits into lfortran:main on Apr 2, 2025

Conversation

adit4443ya (Collaborator)

No description provided.

adit4443ya (Collaborator, Author)

allreduce_4.f90 is the test for this.
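
For context, a minimal sketch of what a 1D-recvbuf allreduce test could look like; the actual allreduce_4.f90 may differ, and the buffer size, datatype, and reduction operation below are assumptions.

! Hypothetical sketch of a 1D-recvbuf allreduce test; the real
! allreduce_4.f90 in this repository may differ in its details.
program allreduce_1d_sketch
    use mpi
    implicit none
    integer :: rank, nprocs, ierr
    real(8) :: sendbuf(5), recvbuf(5)

    call MPI_Init(ierr)
    call MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
    call MPI_Comm_size(MPI_COMM_WORLD, nprocs, ierr)

    ! Each rank contributes rank+1; the sum lands in the 1D recvbuf on every rank.
    sendbuf = real(rank + 1, 8)
    call MPI_Allreduce(sendbuf, recvbuf, 5, MPI_DOUBLE_PRECISION, MPI_SUM, &
                       MPI_COMM_WORLD, ierr)

    ! Every element of recvbuf should equal nprocs*(nprocs+1)/2.
    if (rank == 0) print *, "recvbuf(1) = ", recvbuf(1)
    call MPI_Finalize(ierr)
end program allreduce_1d_sketch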

gxyd (Collaborator) left a comment

Looks great, left one comment.

src/mpi.f90 Outdated
c_op = c_mpi_op_f2c(op)
c_comm = c_mpi_comm_f2c(comm)

local_ierr = c_mpi_allreduce_scalar(sendbuf_ptr, recvbuf_ptr, count, c_datatype, c_op, c_comm)

gxyd (Collaborator):
to avoid ambiguity, can we rename this to c_mpi_allreduce? (and similarly the call in MPI_Allreduce_scalar as well)
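
For reference, a hypothetical sketch of what the renamed binding could look like, assuming the wrapper binds directly to the C MPI_Allreduce and that MPI handles are represented as C pointers (as in Open MPI); the project's actual interface may declare the arguments differently.

! Hypothetical sketch only; the actual declaration in this project's
! C-binding module may use different argument kinds or handle types.
interface
    function c_mpi_allreduce(sendbuf, recvbuf, count, datatype, op, comm) &
            bind(C, name="MPI_Allreduce") result(ierr)
        use iso_c_binding, only: c_ptr, c_int
        type(c_ptr), value :: sendbuf, recvbuf
        integer(c_int), value :: count
        type(c_ptr), value :: datatype, op, comm
        integer(c_int) :: ierr
    end function c_mpi_allreduce
end interface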

src/mpi.f90 Outdated
    ierror = local_ierr
else
    if (local_ierr /= MPI_SUCCESS) then
        print *, "MPI_Allreduce_scalar failed with error code: ", local_ierr

gxyd (Collaborator):
Also, we can rename this to MPI_Allreduce_1D_recv_proc maybe?
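
Putting the two suggestions together, a hypothetical sketch of what the renamed procedure could look like; the argument declarations, the c_mpi_datatype_f2c name, and the buffer type are assumptions, and the actual code in src/mpi.f90 may differ.

! Hypothetical sketch using the suggested names; it assumes the c_mpi_*_f2c
! conversion functions and the c_mpi_allreduce interface sketched above are
! provided by the project's C-binding module, and that it lives inside the
! mpi module where MPI_SUCCESS is defined.
subroutine MPI_Allreduce_1D_recv_proc(sendbuf, recvbuf, count, datatype, op, comm, ierror)
    use iso_c_binding, only: c_ptr, c_loc
    integer, intent(in) :: count, datatype, op, comm
    real(8), target, intent(in)  :: sendbuf(count)
    real(8), target, intent(out) :: recvbuf(count)
    integer, optional, intent(out) :: ierror
    type(c_ptr) :: sendbuf_ptr, recvbuf_ptr, c_datatype, c_op, c_comm
    integer :: local_ierr

    sendbuf_ptr = c_loc(sendbuf)
    recvbuf_ptr = c_loc(recvbuf)
    c_datatype  = c_mpi_datatype_f2c(datatype)   ! helper name assumed, not shown in the excerpt
    c_op        = c_mpi_op_f2c(op)
    c_comm      = c_mpi_comm_f2c(comm)

    ! Call the C MPI_Allreduce directly through the bind(C) interface.
    local_ierr = c_mpi_allreduce(sendbuf_ptr, recvbuf_ptr, count, c_datatype, c_op, c_comm)

    if (present(ierror)) then
        ierror = local_ierr
    else
        if (local_ierr /= MPI_SUCCESS) then
            print *, "MPI_Allreduce_1D_recv_proc failed with error code: ", local_ierr
        end if
    end if
end subroutine MPI_Allreduce_1D_recv_proc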

gxyd merged commit 1b47ef9 into lfortran:main on Apr 2, 2025
20 checks passed

2 participants