
Lassen: Fix Chunked HDF5 with MPI #2863

Merged · 1 commit merged into ECP-WarpX:development from fix-hdf5ChunkedLassen on Feb 17, 2022

Conversation

@ax3l (Member) commented on Feb 16, 2022

Try to work around segfaults with HDF5 when running on more than one node.

  • testing

I think this is caused by open-mpi/ompi#7795.
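One plausible shape of the work-around, sketched here under assumptions rather than taken from this commit: write the openPMD HDF5 datasets without chunking, so dataset creation never enters the collective chunk-fill path that crashes inside ROMIO in the trace below. The variable name is my assumption about openPMD-api's switch for this and should be checked against the openPMD-api version in use.

    # Hypothetical mitigation sketch (not necessarily what this PR changes):
    # write contiguous, non-chunked HDF5 datasets so the
    # H5D__chunk_collective_fill -> ADIOI_Flatten path is never taken.
    # OPENPMD_HDF5_CHUNKS is assumed to be the relevant openPMD-api knob.
    export OPENPMD_HDF5_CHUNKS=none

The backtrace from a multi-node run: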

=== If no file names and line numbers are shown below, one can run
            addr2line -Cpfie my_exefile my_line_address
    to convert `my_line_address` (e.g., 0x4a6b) into file name and line number.
    Or one can use amrex/Tools/Backtrace/parse_bt.py.

=== Please note that the line number reported by addr2line may not be accurate.
    One can use
            readelf -wl my_exefile | grep my_line_address'
    to find out the offset for that line.

 0: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1051b690]
    amrex::BLBackTrace::print_backtrace_info(_IO_FILE*)
??:0

 1: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1051d774]
    amrex::BLBackTrace::handler(int)
??:0

 2: [0x2000000504d8]
    __kernel_sigtramp_rt64
arch/powerpc/kernel/vdso64/sigtramp.S:30

 3: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(ADIOI_Flatten+0xbc4) [0x20001e01488c]
    ADIOI_Flatten
??:0

 4: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(ADIOI_Flatten_datatype+0x1d8) [0x20001e013c8c]
    ADIOI_Flatten_datatype
??:0

 5: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(ADIOI_Flatten_and_find+0x28) [0x20001e017c28]
    ADIOI_Flatten_and_find
??:0

 6: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(+0x2ee70) [0x20001dfcee70]
    ADIOI_Exch_and_write
??:0

 7: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(ADIOI_GPFS_WriteStridedColl+0x1bfc) [0x20001dfce428]
    ADIOI_GPFS_WriteStridedColl
??:0

 8: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(MPIOI_File_write_all+0x610) [0x20001dfbfb84]
    MPIOI_File_write_all
??:0

 9: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(mca_io_romio_dist_MPI_File_write_at_all+0x68) [0x20001dfc07f0]
    mca_io_romio_dist_MPI_File_write_at_all
??:0

10: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_romio321.so(mca_io_romio321_file_write_at_all+0x3c) [0x20001dfb1d7c]
    mca_io_romio321_file_write_at_all
??:0

11: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/lib/libmpi_ibm.so.3(PMPI_File_write_at_all+0x124) [0x2000001993b4]
    PMPI_File_write_at_all
??:0

12: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x3aff10) [0x20000061ff10]
    H5FD_mpio_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5FDmpio.c:1801:43

13: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5FD_write+0xcc) [0x2000003dac1c]
    H5FD_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5FDint.c:257:18

14: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5F__accum_write+0x1b8) [0x2000003ae2d8]
    H5F__accum_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5Faccum.c:825:12

15: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5PB_write+0xa68) [0x200000503e48]
    H5PB_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5PB.c:1027:12

16: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5F_block_write+0x68) [0x2000003bc3c8]
    H5F_block_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5Fio.c:164:8

17: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__chunk_allocate+0x12b0) [0x200000354dc0]
    H5D__chunk_collective_fill inlined at /builddir/build/BUILD/hdf5-1.10.4/src/H5Dchunk.c:4383:12 in H5D__chunk_allocate
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dchunk.c:4702:8
H5D__chunk_allocate
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dchunk.c:4383:12

18: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0xf6e48) [0x200000366e48]
    H5D__init_storage
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:2242:20

19: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__alloc_storage+0x200) [0x20000036d9c0]
    H5D__alloc_storage
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:2155:23

20: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__layout_oh_create+0x4a8) [0x2000003760e8]
    H5D__layout_oh_create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dlayout.c:507:12

21: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__create+0x20e0) [0x20000036a630]
    H5D__update_oh_info inlined at /builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:1098:8 in H5D__create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:797:8
H5D__create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:1098:8

22: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1074fc) [0x2000003774fc]
    H5O__dset_create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Doh.c:299:24

23: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5O_obj_create+0x13c) [0x20000049b9ec]
    H5O_obj_create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Oint.c:2644:37

24: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1e63e4) [0x2000004563e4]
    H5L__link_cb
/builddir/build/BUILD/hdf5-1.10.4/src/H5L.c:1618:53

25: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1afb68) [0x20000041fb68]
    H5G__traverse_real
/builddir/build/BUILD/hdf5-1.10.4/src/H5Gtraverse.c:626:16

26: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5G_traverse+0xc0) [0x200000420060]
    H5G_traverse
/builddir/build/BUILD/hdf5-1.10.4/src/H5Gtraverse.c:851:9

27: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1e2e60) [0x200000452e60]
    H5L__create_real
/builddir/build/BUILD/hdf5-1.10.4/src/H5L.c:1812:8

28: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5L_link_object+0x5c) [0x2000004580fc]
    H5L_link_object
/builddir/build/BUILD/hdf5-1.10.4/src/H5L.c:1571:7

29: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__create_named+0x74) [0x200000367fa4]
    H5D__create_named
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:325:8

30: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5Dcreate2+0x284) [0x20000033e734]
    H5Dcreate2
/builddir/build/BUILD/hdf5-1.10.4/src/H5D.c:144:24

31: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106a5cf0]
    openPMD::HDF5IOHandlerImpl::createDataset(openPMD::Writable*, openPMD::Parameter<(openPMD::Operation)9> const&)
??:0

32: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106aa554]
    openPMD::AbstractIOHandlerImpl::flush()
??:0

33: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106b1bd4]
    openPMD::ParallelHDF5IOHandler::flush()
??:0

34: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x105ee570]
    openPMD::Mesh::flush_impl(std::string const&)
??:0

35: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1011c0ac]
    openPMD::BaseRecord<openPMD::MeshRecordComponent>::flush(std::string const&)
??:0

36: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x105ddb0c]
    openPMD::Iteration::flush()
??:0

37: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x105ddef4]
    openPMD::Iteration::flushFileBased(std::string const&, unsigned long)
??:0

38: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10631a04]
    openPMD::SeriesInterface::flushFileBased(std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >, std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >)
??:0

39: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10632920]
    openPMD::SeriesInterface::flush_impl(std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >, std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >, openPMD::FlushLevel, bool)
??:0

40: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106329c4]
    openPMD::SeriesInterface::flush()
??:0

41: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10111f60]
    WarpXOpenPMDPlot::WriteOpenPMDFieldsAll(std::vector<std::string, std::allocator<std::string> > const&, amrex::Vector<amrex::MultiFab, std::allocator<amrex::MultiFab> > const&, amrex::Vector<amrex::Geometry, std::allocator<amrex::Geometry> >&, int, double, bool, amrex::Geometry const&) const
??:0

42: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10189f50]
    FlushFormatOpenPMD::WriteToFile(amrex::Vector<std::string, std::allocator<std::string> >, amrex::Vector<amrex::MultiFab, std::allocator<amrex::MultiFab> > const&, amrex::Vector<amrex::Geometry, std::allocator<amrex::Geometry> >&, amrex::Vector<int, std::allocator<int> >, double, amrex::Vector<ParticleDiag, std::allocator<ParticleDiag> > const&, int, std::string, int, bool, bool, bool, int, amrex::Geometry const&, bool) const
??:0

43: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x100c5c30]
    FullDiagnostics::Flush(int)
??:0

44: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x100bf9ac]
    Diagnostics::FilterComputePackFlush(int, bool)
??:0

45: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x100ca5d0]
    MultiDiagnostics::FilterComputePackFlush(int, bool)
??:0

46: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10287c9c]
    WarpX::InitData()
??:0

47: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1001d3f4]
    main
??:0

48: /lib64/libc.so.6(+0x25300) [0x200000b35300]
    generic_start_main
../csu/libc-start.c:266

49: /lib64/libc.so.6(__libc_start_main+0xc4) [0x200000b354f4]
    __libc_start_main
../sysdeps/unix/sysv/linux/powerpc/libc-start.c:81


===== TinyProfilers ======
main()
WarpX::InitData()
Diagnostics::FilterComputePackFlush()
FlushFormatOpenPMD::WriteToFile()
WarpXOpenPMDPlot::WriteOpenPMDFields()

@ax3l added the component: openPMD, machine / system, and workaround labels on Feb 16, 2022
@EZoni merged commit a6f876a into ECP-WarpX:development on Feb 17, 2022
@ax3l deleted the fix-hdf5ChunkedLassen branch on Feb 17, 2022 at 04:43
@ax3l (Member, Author) commented on Feb 17, 2022

Orthogonal to this, we see the following IBM MPI error now:

warpx.2d.MPI.CUDA.DP.OPMD.QED: ../../ccmi/executor/HybridFlatAllgather.h:206: void CCMI::Executor::HybridFlatAllgather<T_ConnMgr, T_Barrier, T_Type>::shortCopyOUT() [with T_ConnMgr = CCMI::ConnectionManager::CommSeqConnMgr; T_Barrier = CCMI::Executor::ShmemBarrier; int T_Type = 1]: Assertion `_rbuf != __null' failed.
SIGABRT
Segfault

Backtrace:

=== If no file names and line numbers are shown below, one can run
            addr2line -Cpfie my_exefile my_line_address
    to convert `my_line_address` (e.g., 0x4a6b) into file name and line number.
    Or one can use amrex/Tools/Backtrace/parse_bt.py.

=== Please note that the line number reported by addr2line may not be accurate.
    One can use
            readelf -wl my_exefile | grep my_line_address'
    to find out the offset for that line.

 0: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1051b690]
    amrex::BLBackTrace::print_backtrace_info(_IO_FILE*)
??:0

 1: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1051d774]
    amrex::BLBackTrace::handler(int)
??:0

 2: [0x2000000504d8]
    __kernel_sigtramp_rt64
arch/powerpc/kernel/vdso64/sigtramp.S:30

 3: /lib64/libc.so.6(abort+0x2b4) [0x200000b52134]
    __GI_abort
/usr/src/debug/glibc-2.17-c758a686/stdlib/abort.c:75

 4: /lib64/libc.so.6(+0x357d4) [0x200000b457d4]
    __assert_fail_base
/usr/src/debug/glibc-2.17-c758a686/assert/assert.c:92

 5: /lib64/libc.so.6(__assert_fail+0x64) [0x200000b458c4]
    __GI___assert_fail
/usr/src/debug/glibc-2.17-c758a686/assert/assert.c:101

 6: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/pami_port/libcollectives.so.3(_ZN4CCMI8Executor19HybridFlatAllgatherINS_17ConnectionManager14CommSeqConnMgrENS0_12ShmemBarrierELi1EE12shortCopyOUTEv+0x1cc) [0x20000308552c]
    CCMI::Executor::HybridFlatAllgather<CCMI::ConnectionManager::CommSeqConnMgr, CCMI::Executor::ShmemBarrier, 1>::shortCopyOUT()
../../ccmi/executor/HybridFlatAllgather.h:206

 7: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/pami_port/libcollectives.so.3(_ZN4CCMI8Executor19HybridFlatAllgatherINS_17ConnectionManager14CommSeqConnMgrENS0_12ShmemBarrierELi1EE10advance_fnILb1EEE16libcoll_result_tPvS8_+0x398) [0x200003085c28]
    CCMI::Executor::HybridFlatAllgather<CCMI::ConnectionManager::CommSeqConnMgr, CCMI::Executor::ShmemBarrier, 1>::short_advance() inlined at ../../ccmi/executor/HybridFlatAllgather.h:352 in libcoll_result_t CCMI::Executor::HybridFlatAllgather<CCMI::ConnectionManager::CommSeqConnMgr, CCMI::Executor::ShmemBarrier, 1>::advance_fn<true>(void*, void*)
../../ccmi/executor/HybridFlatAllgather.h:806
libcoll_result_t CCMI::Executor::HybridFlatAllgather<CCMI::ConnectionManager::CommSeqConnMgr, CCMI::Executor::ShmemBarrier, 1>::advance_fn<true>(void*, void*)
../../ccmi/executor/HybridFlatAllgather.h:352

 8: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/pami_port/libcollectives.so.3(_ZN7LibColl18NativeInterfaceP2PILb1ELb0EE12call_work_fnEPvS2_+0x2c) [0x200002ff4ccc]
    LibColl::NativeInterfaceP2P<true, false>::call_work_fn(void*, void*)
../../adapter/pami/NativeInterface.h:347

 9: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/pami_port/libpami.so.3(PAMI_Context_advancev+0x128) [0x200002a28b38]
    PAMI_Context_advancev
??:0

10: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/pami_port/libcollectives.so.3(LIBCOLL_Advance_pami+0x34) [0x200002fe99c4]
    LIBCOLL_Advance_pami
/__SMPI_build_dir_______________________________________/ibmsrc/mini_libcoll/src/adapter/pami/api.cc:87

11: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/pami_port/libcollectives.so.3(LIBCOLL_Advance+0x18) [0x200002fe3b58]
    LIBCOLL_Advance
/__SMPI_build_dir_______________________________________/ibmsrc/mini_libcoll/src/adapter/libcoll.cc:133

12: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_coll_ibm.so(start_libcoll_blocking_collective+0x168) [0x200002eed1b8]
    start_libcoll_blocking_collective
??:0

13: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_coll_ibm.so(mca_coll_ibm_allgatherv+0x3c8) [0x200002eee2c8]
    mca_coll_ibm_allgatherv
??:0

14: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_fcoll_vulcan.so(mca_fcoll_vulcan_file_write_all+0xc50) [0x20001eb17670]
    mca_fcoll_vulcan_file_write_all
??:0

15: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/lib/libmca_common_ompio.so.3(mca_common_ompio_file_write_at_all+0x88) [0x20001e9fa848]
    mca_common_ompio_file_write_at_all
??:0

16: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/container/../lib/spectrum_mpi/mca_io_ompio.so(mca_io_ompio_file_write_at_all+0x44) [0x20001e9c71f4]
    mca_io_ompio_file_write_at_all
??:0

17: /usr/tce/packages/spectrum-mpi/ibm/spectrum-mpi-rolling-release/lib/libmpi_ibm.so.3(PMPI_File_write_at_all+0x124) [0x2000001993b4]
    PMPI_File_write_at_all
??:0

18: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x3aff10) [0x20000061ff10]
    H5FD_mpio_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5FDmpio.c:1801:43

19: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5FD_write+0xcc) [0x2000003dac1c]
    H5FD_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5FDint.c:257:18

20: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5F__accum_write+0x1b8) [0x2000003ae2d8]
    H5F__accum_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5Faccum.c:825:12

21: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5PB_write+0xa68) [0x200000503e48]
    H5PB_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5PB.c:1027:12

22: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5F_block_write+0x68) [0x2000003bc3c8]
    H5F_block_write
/builddir/build/BUILD/hdf5-1.10.4/src/H5Fio.c:164:8

23: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__chunk_allocate+0x12b0) [0x200000354dc0]
    H5D__chunk_collective_fill inlined at /builddir/build/BUILD/hdf5-1.10.4/src/H5Dchunk.c:4383:12 in H5D__chunk_allocate
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dchunk.c:4702:8
H5D__chunk_allocate
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dchunk.c:4383:12

24: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0xf6e48) [0x200000366e48]
    H5D__init_storage
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:2242:20

25: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__alloc_storage+0x200) [0x20000036d9c0]
    H5D__alloc_storage
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:2155:23

26: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__layout_oh_create+0x4a8) [0x2000003760e8]
    H5D__layout_oh_create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dlayout.c:507:12

27: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__create+0x20e0) [0x20000036a630]
    H5D__update_oh_info inlined at /builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:1098:8 in H5D__create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:797:8
H5D__create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:1098:8

28: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1074fc) [0x2000003774fc]
    H5O__dset_create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Doh.c:299:24

29: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5O_obj_create+0x13c) [0x20000049b9ec]
    H5O_obj_create
/builddir/build/BUILD/hdf5-1.10.4/src/H5Oint.c:2644:37

30: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1e63e4) [0x2000004563e4]
    H5L__link_cb
/builddir/build/BUILD/hdf5-1.10.4/src/H5L.c:1618:53

31: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1afb68) [0x20000041fb68]
    H5G__traverse_real
/builddir/build/BUILD/hdf5-1.10.4/src/H5Gtraverse.c:626:16

32: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5G_traverse+0xc0) [0x200000420060]
    H5G_traverse
/builddir/build/BUILD/hdf5-1.10.4/src/H5Gtraverse.c:851:9

33: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(+0x1e2e60) [0x200000452e60]
    H5L__create_real
/builddir/build/BUILD/hdf5-1.10.4/src/H5L.c:1812:8

34: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5L_link_object+0x5c) [0x2000004580fc]
    H5L_link_object
/builddir/build/BUILD/hdf5-1.10.4/src/H5L.c:1571:7

35: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5D__create_named+0x74) [0x200000367fa4]
    H5D__create_named
/builddir/build/BUILD/hdf5-1.10.4/src/H5Dint.c:325:8

36: /usr/tce/packages/hdf5/hdf5-parallel-1.10.4-gcc-8.3.1-spectrum-mpi-rolling-release/lib/libhdf5.so.103(H5Dcreate2+0x284) [0x20000033e734]
    H5Dcreate2
/builddir/build/BUILD/hdf5-1.10.4/src/H5D.c:144:24

37: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106a5cf0]
    openPMD::HDF5IOHandlerImpl::createDataset(openPMD::Writable*, openPMD::Parameter<(openPMD::Operation)9> const&)
??:0

38: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106aa554]
    openPMD::AbstractIOHandlerImpl::flush()
??:0

39: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106b1bd4]
    openPMD::ParallelHDF5IOHandler::flush()
??:0

40: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x105ee570]
    openPMD::Mesh::flush_impl(std::string const&)
??:0

41: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1011c0ac]
    openPMD::BaseRecord<openPMD::MeshRecordComponent>::flush(std::string const&)
??:0

42: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x105ddb0c]
    openPMD::Iteration::flush()
??:0

43: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x105ddef4]
    openPMD::Iteration::flushFileBased(std::string const&, unsigned long)
??:0

44: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10631a04]
    openPMD::SeriesInterface::flushFileBased(std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >, std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >)
??:0

45: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10632920]
    openPMD::SeriesInterface::flush_impl(std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >, std::_Rb_tree_iterator<std::pair<unsigned long const, openPMD::Iteration> >, openPMD::FlushLevel, bool)
??:0

46: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x106329c4]
    openPMD::SeriesInterface::flush()
??:0

47: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10111f60]
    WarpXOpenPMDPlot::WriteOpenPMDFieldsAll(std::vector<std::string, std::allocator<std::string> > const&, amrex::Vector<amrex::MultiFab, std::allocator<amrex::MultiFab> > const&, amrex::Vector<amrex::Geometry, std::allocator<amrex::Geometry> >&, int, double, bool, amrex::Geometry const&) const
??:0

48: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10189f50]
    FlushFormatOpenPMD::WriteToFile(amrex::Vector<std::string, std::allocator<std::string> >, amrex::Vector<amrex::MultiFab, std::allocator<amrex::MultiFab> > const&, amrex::Vector<amrex::Geometry, std::allocator<amrex::Geometry> >&, amrex::Vector<int, std::allocator<int> >, double, amrex::Vector<ParticleDiag, std::allocator<ParticleDiag> > const&, int, std::string, int, bool, bool, bool, int, amrex::Geometry const&, bool) const
??:0

49: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x100c5c30]
    FullDiagnostics::Flush(int)
??:0

50: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x100bf9ac]
    Diagnostics::FilterComputePackFlush(int, bool)
??:0

51: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x100ca5d0]
    MultiDiagnostics::FilterComputePackFlush(int, bool)
??:0

52: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x10287c9c]
    WarpX::InitData()
??:0

53: src/warpx/build/bin/warpx.2d.MPI.CUDA.DP.OPMD.QED() [0x1001d3f4]
    main
??:0

54: /lib64/libc.so.6(+0x25300) [0x200000b35300]
    generic_start_main
../csu/libc-start.c:266

55: /lib64/libc.so.6(__libc_start_main+0xc4) [0x200000b354f4]
    __libc_start_main
../sysdeps/unix/sysv/linux/powerpc/libc-start.c:81


===== TinyProfilers ======
main()
WarpX::InitData()
Diagnostics::FilterComputePackFlush()
FlushFormatOpenPMD::WriteToFile()
WarpXOpenPMDPlot::WriteOpenPMDFields()

@ax3l (Member, Author) commented on Feb 17, 2022

For the last issue, we could try to use

  export OMPI_MCA_coll_ibm_skip_barrier=true

to work around the IBM collective issues, as we do on Summit; tests pending. On Summit the problem was with barriers, while here it seems to be an allgather, so there may be a separate flag for that one.
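If a dedicated switch for the allgather path exists, listing the MCA parameters of the IBM collectives component should reveal it. A minimal check, assuming `ompi_info` is available in this Spectrum MPI installation (it is Open MPI based):

    # List the coll_ibm component's "skip" parameters; the availability of
    # ompi_info in this Spectrum MPI install is an assumption.
    ompi_info --all | grep -i coll_ibm_skip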

@ax3l (Member, Author) commented on Feb 17, 2022

@joshualudwig8 can you please test with the latest WarpX whether setting this environment variable helps with the HDF5 crashes on Lassen?

@ax3l (Member, Author) commented on Feb 17, 2022

Actually, I found a guide: https://www.ibm.com/docs/en/SSZTET_EOS/eos/guide_101.pdf

Let's try this:

  export OMPI_MCA_coll_ibm_skip_allgatherv=true
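
For completeness, a minimal sketch of where these exports would sit in a Lassen LSF batch script; the node count, queue, wall time, jsrun resource flags, and input file name below are illustrative assumptions, not taken from this PR:

    #!/bin/bash
    #BSUB -nnodes 2
    #BSUB -W 30
    #BSUB -q pbatch

    # Work around IBM Spectrum MPI collective issues seen with parallel HDF5:
    export OMPI_MCA_coll_ibm_skip_barrier=true      # barrier workaround used on Summit
    export OMPI_MCA_coll_ibm_skip_allgatherv=true   # candidate fix for the allgatherv assert

    # Launch (executable name from the backtrace above; resource set and input file are examples):
    jsrun -n 8 -a 1 -g 1 -c 7 ./warpx.2d.MPI.CUDA.DP.OPMD.QED inputs_2d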

@joshualudwig8 commented on Feb 17, 2022 via email

@ax3l (Member, Author) commented on Feb 18, 2022

Awesome, will open another PR to document this!

Buggy IBM implementations everywhere, hah.

@ax3l (Member, Author) commented on Feb 18, 2022

Documentation update incoming via #2874

roelof-groenewald added a commit to ModernElectron/WarpX that referenced this pull request on Feb 19, 2022
* NCIGodfreyFilter: Fix Int Division (ECP-WarpX#2837)

* NCIGodfreyFilter: Fix Int Division

`m_cdtodz` is between 0 and 1, and we interpolate a set of
coefficients from a table.

* reset benchmarks

Co-authored-by: Tools <warpx@lbl.gov>

* Adding documentation for lxplus (ECP-WarpX#2756)

* Adding documentation for lxplus

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Adding the new documentation

* Apply suggestions from code review

Some suggestions from the code review.

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Removing suggestion about miniconda

* Switched to using anonymous environment

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Fixing the architecture

* Apply suggestions from code review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Update Tools/machines/lxplus-cern/spack.yaml

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Updating env to use pre-installed openmpi

* Moving a file

* Docs: Now using GCC 11.2.0

* Add OpenMPI Version in Spec

* Add CPU target architecture note

* One more GCC 9.2.0->11.2.0 Update

* Finalize Spack Stack Setup Notes

* Move AFS Spack Config to Environment File

* Comment on Variations of the Spack Env

* Improve git clone and spack activation

* Apply suggestions from code review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Fixing ncurses

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Fix: Spack Concretization (CUDA/Python)

Somehow, this concretizes the variants not properly otherwise.

* Not using the preinstalled openmpi anymore

* updated lxplus.rst

* added lxplus_warpx.profile.example

* Apply suggestions from code review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Apply suggestions from code review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Updated documentation

* Added a clarification about warpx.profile

* Apply suggestions from code review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Add missing empty newline

Co-authored-by: lgiacome <lorenzo.giacome@cern.ch>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* BackTransformParticleFunctor: Unused Counter (ECP-WarpX#2840)

Remove an unused counter in BTD particle filtering.

Seen first with a HIP diagnostics.

* ABLASTR: particle weights `const` (ECP-WarpX#2838)

* ABLASTR: particle weights `const`

We can declare the particle weights `const` because we don't
change values in them during deposition.

* DepositCharge: `const`-ify usage

* Rename ngE as ngEB (used for E,B) (ECP-WarpX#2841)

* Fix some offsets with the gather buffers (ECP-WarpX#2847)

* Add amrex REPO and BRANCH flags for python builds (ECP-WarpX#2845)

* Add WarpX_amrex_repo and _branch options to Python

In python setup, environment variables WARPX_AMREX_REPO and
WARPX_AMREX_BRANCH will now set these variables

* Update documentation with new compile envvars

* User-defined integer and real particle attributes (ECP-WarpX#2735)

* define user attributes, parse them, initialize with respective parsers

* fix warning by using static_cast for int attribute as parser returns real

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* clean-up from self-review

* adding dimensionless velocity, gamma*v/c and time to parser argument

* add documentation

* typo in comment

* unused var

* device vector for kernels

* particle attribute in developer doc

* data ptr for device vector

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* ignore_unused

* Docs: Describe all particle attributes

including pre-defined ones :)

* Docs: Fix formatting (user params)

* Add: 1D and RZ Support

* Docs: Fix Typo in Function Declaration

* Laser-Ion Example: User-Defined Attrib.

Add two user-defined attributes to the laser-ion acceleration
example. This is a 2D test.

Documents the name in the table of commonly used, user-defined
attribute names. The attribute added is the original position
of particles, which I like to plot in "potential" plots that
correlate original position in the target with final energy.

* changing user-interface API with .attribute. and no need for separate 1D 2D 3D RZ code for parser. pos.x/y/z returns the right values

* Adding 1D, 3D, and rz tests

* attribute in inputs

* at(i)

* refinining names for inputs for laser ion and acceleration tests

* typo in input

* reset benchmarks for test-cases that included attributes

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Don't cut all particles in a Gaussian beam when x_rms=0 (ECP-WarpX#2844)

* [pre-commit.ci] pre-commit autoupdate (ECP-WarpX#2851)

updates:
- [github.com/Lucas-C/pre-commit-hooks: v1.1.11 → v1.1.12](Lucas-C/pre-commit-hooks@v1.1.11...v1.1.12)

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

* Use parser to read laser spatio-temporal couplings direction (ECP-WarpX#2843)

* Only set modified k to 0 for even number of points (ECP-WarpX#2852)

* Only set modified k to 0 for even number of points

* Update Source/FieldSolver/SpectralSolver/SpectralKSpace.cpp

* Allow flux injection in the out-of-plane direction for RZ/2D geometry (ECP-WarpX#2788)

* Implement injection orthogonal to plane

* Generalize momentum distribution for flux injection

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Revert "[pre-commit.ci] auto fixes from pre-commit.com hooks"

This reverts commit b0cd189.

* Revert "Generalize momentum distribution for flux injection"

This reverts commit 0a22b1d.

* Rotate momentum initialization

* Correct flux number when the direction is normal to plane

* Update distribution of particles within a cell

* Clean-up injection code

* Add more documentation

* Add more comments

* Handle 1D case

* Only do the rotation for Gaussian flux profile

* Fix compilation error

* Correct compilation for GPU

* Start adding automated test

* Correct sign of velocity

* Update to add continuous injection

* Finalize test

* Correct processing of flux_normal_axis

* Add checksum

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Fix bug

* Update script

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Update checksum

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

* AMReX/PICSAR: Weekly Update (ECP-WarpX#2849)

* AMReX: Update latest commit

* AMReX: Update to 76d08651adb987e3fba6b232e806c5e7c365a8d9

* update CI to use ascent 0.8.0 release container (ECP-WarpX#2858)

* use ascent 0.8.0 release container

* try again

* restore

* use new install loc

* Correct typo in the relativistic Poisson solver (ECP-WarpX#2853)

* Correct typo in the relativistic Poisson solver

* Fix unused variable

* Update benchmark

* Gaussian particle beam: add error message when using y_rms = 0 in 2D (ECP-WarpX#2862)

* ParticleBuffer: Generalize & Move (ECP-WarpX#2860)

* ParticleBuffer: Generalize & Move

- move the `ParticleBuffer` to ABLASTR
- generalize the API
- remove `amr_core` argument
- use more semantic naming
- add docs

* Use `amrex::ParticleContainer::make_alike`

* Update AMREX

to include AMReX-Codes/amrex#2630

* ABLASTR: Refactor `deposit_charge` API (ECP-WarpX#2856)

Simplified and re-ordered interface for
`ablastr::particles::deposit_charge`.

* Provide `t_min` and `t_max` for flux injection (ECP-WarpX#2842)

* Implement injection orthogonal to plane

* Generalize momentum distribution for flux injection

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Revert "[pre-commit.ci] auto fixes from pre-commit.com hooks"

This reverts commit b0cd189.

* Revert "Generalize momentum distribution for flux injection"

This reverts commit 0a22b1d.

* Rotate momentum initialization

* Correct flux number when the direction is normal to plane

* Update distribution of particles within a cell

* Clean-up injection code

* Add more documentation

* Add more comments

* Handle 1D case

* Only do the rotation for Gaussian flux profile

* Fix compilation error

* Correct compilation for GPU

* Start adding automated test

* Correct sign of velocity

* Update to add continuous injection

* Finalize test

* Correct processing of flux_normal_axis

* Add checksum

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Fix bug

* Update script

* Implement maximum injection time

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Add parameter tmin

* Make parameter optional ; update documentation

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

* Lassen: Fix Chunked HDF5 with MPI (ECP-WarpX#2863)

Try to work-around segfaults with HDF5 when running on more than one node.

* Refactor Current Correction Functions (ECP-WarpX#2839)

* Refactor Current Correction Functions

* Clean Up, Reset Benchmark

* Rotate momentum for RZ flux injection (ECP-WarpX#2867)

* Add warning to FieldProbe re: Boosted Frame (ECP-WarpX#2868)

* Add warning to FieldProbe re: Boosted Frames

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Update parsing of FieldProbe in 2D and 1D (ECP-WarpX#2818)

* Update parsing of FieldProbe in 2D and 1D

* Fix unused variables

* Allow plane probe in 2D and line probe in 1D

* doc update

* Do Not Fill PML Guard Cells w/ Inverse FFTs (ECP-WarpX#2854)

* Fix number of guard cell for coarse patch (ECP-WarpX#2869)

* openPMD: Add ADIOS2 Engine Parameter Control (ECP-WarpX#2872)

* Adds support for ADIOS engines ECP-WarpX#2866

Input file can now have lines like
diag1.adios2_engine.parameters.NumAggregators=2

* Docs for ADIOS engine type and parameters ECP-WarpX#2866

* Aesthetic edit in adios2 engine documentation ECP-WarpX#2866

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* Removed debug print statement ECP-WarpX#2866

* Style Updates

Co-authored-by: Mehta, Kshitij V <kshitij-v-mehta@github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Add `PHistDiag` for scraping (#153)

* refactor of surface flux diagnostic handling before implementing PHistDiag

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* initial commit of `ParticleHistDiag`

* updated changelog and version number

* added plotting functionality specifically for ZPlane assemblies

* save histogram binning details to file as well

* code cleanup

* changes requested during PR review

* add comment about 2d plotting

* remove debugging print statement

* further code changes from PR review

* Fix typo

Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>

* fix another typo

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>

* Add initialization of pairwise Coulomb collisions (#155)

* refactor of surface flux diagnostic handling before implementing PHistDiag

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* initial commit of `ParticleHistDiag`

* updated changelog and version number

* added plotting functionality specifically for ZPlane assemblies

* save histogram binning details to file as well

* code cleanup

* changes requested during PR review

* add comment about 2d plotting

* added Coulomb collision installation to picmi.py

* added pairwise Coulomb collision initialization

* remove debugging print statement

* further code changes from PR review

* Fix typo

Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>

* fix docstring

* cleaned up the logging message for Coulomb scattering

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Tools <warpx@lbl.gov>
Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>
Co-authored-by: lgiacome <lorenzo.giacome@cern.ch>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>
Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>
Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>
Co-authored-by: Revathi  Jambunathan <41089244+RevathiJambunathan@users.noreply.github.com>
Co-authored-by: Remi Lehe <remi.lehe@normalesup.org>
Co-authored-by: Cyrus Harrison <cyrush@llnl.gov>
Co-authored-by: Tiberius Rheaume <35204125+TiberiusRheaume@users.noreply.github.com>
Co-authored-by: Kshitij Mehta <kshitij-v-mehta@users.noreply.github.com>
Co-authored-by: Mehta, Kshitij V <kshitij-v-mehta@github.com>
Labels: component: openPMD (openPMD I/O), machine / system (Machine or system-specific issue), workaround
3 participants