Feature request: Reduce memory usage for forward operators MPI window #718

Closed
hkershaw-brown opened this issue Aug 13, 2024 · 2 comments · Fixed by #735
hkershaw-brown commented Aug 13, 2024

Use case

We currently make a copy of the data from the ensemble handle to the window.

Is your feature request related to a problem?

The results are not incorrect, but we are using more memory per core than we need to.

Describe your preferred solution

Create the window with the whole state_ens_handle%copies array, which is 'simply contiguous' (see footnote 1 below).
In get_state, copies_in_window is not equal to the number of copies you need to get, so the window indexing logic would need to be updated for the whole %copies array (a rough sketch follows the footnote below).

  1. MPI_Win_create: In Fortran, one can pass the first element of a memory region or a whole array, which must be 'simply contiguous' (for 'simply contiguous', see also MPI 3.0, Section 17.1.12 on page 626).
     https://www.mpi-forum.org/docs/mpi-3.1/mpi31-report.pdf
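
As a concrete illustration, here is a minimal, hypothetical Fortran sketch (not DART code) of creating the window directly on a whole rank-2 array, relying on the 'simply contiguous' rule quoted in the footnote. The array name and sizes are made up for the example.

! Hypothetical sketch, not DART code: create the MPI window on a whole
! rank-2 array, which MPI-3 allows because the whole array is 'simply contiguous'.
program whole_array_window_sketch
use mpi
implicit none

integer, parameter :: num_copies = 80, my_num_vars = 1000
real(8), allocatable :: copies(:,:)      ! stands in for state_ens_handle%copies
integer :: window, bytes_per_element, ierr
integer(KIND=MPI_ADDRESS_KIND) :: window_size

call MPI_Init(ierr)

allocate(copies(num_copies, my_num_vars))
copies = 0.0_8

call MPI_Type_size(MPI_REAL8, bytes_per_element, ierr)
window_size = int(num_copies, MPI_ADDRESS_KIND) * my_num_vars * bytes_per_element

! Pass the whole 2d array directly: no separate 1d copy is needed.
call MPI_Win_create(copies, window_size, bytes_per_element, &
                    MPI_INFO_NULL, MPI_COMM_WORLD, window, ierr)

call MPI_Win_free(window, ierr)
deallocate(copies)
call MPI_Finalize(ierr)

end program whole_array_window_sketch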

Spec

Describe any alternatives you have considered

None

hkershaw-brown (Member Author) commented:

This is an aside, but there are unused copies_in_window imports in the CM1 model_mod & filter_mod (& filter_dopplerfold):

use ensemble_manager_mod, only : ensemble_type, copies_in_window

map_pe_to_task, copies_in_window, set_num_extra_copies, &

pull these out.

hkershaw-brown commented Aug 13, 2024

note this is why there is a copy:

! Global memory to stick the mpi window to.
! Need a simply contiguous piece of memory to pass to mpi_win_create
! Openmpi 1.10.0 will not compile with ifort 16 if
! you create a window with a 2d array.

https://github.com/NCAR/DART/blob/75cf8dc9c566221f624ffd4d5eeba9fde5a1757c/assimilation_code/modules/utilities/no_cray_win_mod.f90#L43C1-L46C39
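
For contrast, here is a hypothetical sketch of the pattern that comment describes (again, not the actual DART source): the 2d ensemble array is copied into a 1d module array and the window is created on that copy. The module, routine, and variable names are made up; the extra allocation is the per-core memory this issue wants to remove.

! Hypothetical sketch of the current copy-based pattern (not the DART source).
module copy_window_sketch
use mpi
implicit none
real(8), allocatable :: contiguous_copy(:)   ! made-up name for the 1d staging array

contains

subroutine create_state_window(copies, window)
   real(8), intent(in)  :: copies(:,:)       ! stands in for state_ens_handle%copies
   integer, intent(out) :: window
   integer :: bytes_per_element, ierr
   integer(KIND=MPI_ADDRESS_KIND) :: window_size

   ! This allocation and copy is the extra memory per core the issue describes.
   allocate(contiguous_copy(size(copies)))
   contiguous_copy = reshape(copies, (/ size(copies) /))

   call MPI_Type_size(MPI_REAL8, bytes_per_element, ierr)
   window_size = int(size(copies), MPI_ADDRESS_KIND) * bytes_per_element
   call MPI_Win_create(contiguous_copy, window_size, bytes_per_element, &
                       MPI_INFO_NULL, MPI_COMM_WORLD, window, ierr)
end subroutine create_state_window

end module copy_window_sketch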

hkershaw-brown self-assigned this Sep 6, 2024
hkershaw-brown added a commit that referenced this issue Sep 6, 2024
using external fortran-testanything
model_mod.f90 is in the work directory. This is a lazy move to avoid fiddling with EXTRA in quickbuild.sh

for issue #718
hkershaw-brown added a commit that referenced this issue Sep 6, 2024
fixes #718

Note I have only done mpi_utilities_mod and not the f08 mpi utilities yet.
The tests pass on gfortran; I have not tried intel yet because I do not want to be
disappointed
hkershaw-brown added a commit that referenced this issue Sep 10, 2024
mpi version with all of %copies in the window
mpif08 version with all of %copies in the window
null_mpi_utilities_mod.f90 version with matching arguments to the mpi version (does nothing)

fixes #718
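
As a rough illustration of the indexing change those commits imply, here is a hypothetical get_state-style fetch against a window that exposes all of %copies. The routine name, arguments, and layout assumptions (copies stored as copies(num_copies, my_num_vars), one real(8) per displacement unit) are mine for the sketch, not the actual DART code.

! Hypothetical sketch, not the DART implementation: fetch every copy of one
! state element from a window created on the whole copies(num_copies, :) array.
! Assumes a displacement unit of one real(8), so element i on the owning task
! starts at displacement (i-1)*num_copies.
subroutine get_state_sketch(window, num_copies, owner, elm_index, fragment)
   use mpi
   implicit none
   integer, intent(in)  :: window, num_copies, owner, elm_index
   real(8), intent(out) :: fragment(num_copies)
   integer(KIND=MPI_ADDRESS_KIND) :: target_disp
   integer :: ierr

   target_disp = int(elm_index - 1, MPI_ADDRESS_KIND) * num_copies

   call MPI_Win_lock(MPI_LOCK_SHARED, owner, 0, window, ierr)
   call MPI_Get(fragment, num_copies, MPI_REAL8, owner, target_disp, &
                num_copies, MPI_REAL8, window, ierr)
   call MPI_Win_unlock(owner, window, ierr)
end subroutine get_state_sketch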