
openPMD-api: 0.14.2 #2150

Merged: 1 commit, Aug 18, 2021
Conversation


@ax3l ax3l commented Jul 30, 2021

This updates our requirements to openPMD-api 0.14.2+.

Among other features, this release introduced resizable data sets, which we will soon use for backtransformed particle diagnostics.
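The resizable-dataset workflow mentioned above can be sketched with a tiny stand-in class — a hypothetical illustration of the grow-then-write pattern, not the actual openPMD-api interface. A backtransformed diagnostic produces its data in slices over time, so the dataset extent must grow between writes:

```python
from array import array

class ResizableDataset:
    """Hypothetical stand-in for an openPMD-api 0.14 resizable dataset.

    A backtransformed particle diagnostic produces data incrementally,
    so each flush must (1) enlarge the dataset extent and then
    (2) store the new chunk at the old end.
    """

    def __init__(self, typecode="d"):
        self._data = array(typecode)  # backing store, initially empty

    def append(self, chunk):
        old_end = len(self._data)
        self._data.extend(chunk)  # (1) grow the extent
        # (2) the chunk now occupies [old_end, old_end + len(chunk))
        return old_end

    @property
    def extent(self):
        return (len(self._data),)

ds = ResizableDataset()
ds.append([1.0, 2.0])       # first flush
offset = ds.append([3.0])   # a later flush lands at offset 2
```

With a fixed-extent dataset, the total size would have to be known at creation time; the resizable variant lets each flush extend the extent instead.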

@ax3l ax3l added the component: openPMD openPMD I/O label Jul 30, 2021

@ax3l ax3l force-pushed the topic-openPMD-0.14.0 branch from 9d72282 to 7a3f400 Compare August 2, 2021 16:55
@ax3l ax3l changed the title [WIP] openPMD-api: 0.14.0 [WIP] openPMD-api: 0.14.1 Aug 3, 2021
@ax3l ax3l force-pushed the topic-openPMD-0.14.0 branch from 7a3f400 to 7271772 Compare August 5, 2021 01:24
@ax3l ax3l requested review from guj and RevathiJambunathan August 5, 2021 01:24


ax3l commented Aug 5, 2021

Interesting: locally, with HDF5 1.10.7, I see no issue for LaserIonAcc2d/qed_breit_wheeler_2d_opmd (I picked one of them). On CI, with HDF5 1.12.0, the latest release fails...

Compiled and run locally with CMake, it does not crash... Compiled locally with GNUmake via run_test.sh, it does crash...

@ax3l ax3l force-pushed the topic-openPMD-0.14.0 branch 2 times, most recently from 24eee44 to 8f57cc9 Compare August 6, 2021 18:38

ax3l commented Aug 7, 2021

Maybe the remaining problem is a mismatch between the GCC versions used for HDF5/openPMD and for WarpX in run_test.sh: openPMD/openPMD-api#979 (comment)


ax3l commented Aug 13, 2021

The problem, in the end, was #2193.

WeiqunZhang pushed a commit that referenced this pull request Aug 13, 2021
Make sure we do not return an empty string in `WarpX::Version()`. In some situations, e.g., in CI/run_test.sh, the macro for the `WARPX_GIT_VERSION` version in GNUmake is set but empty.

Since empty versions are problematic for HDF5 attributes and confusing anyway, we now return a proper `"Unknown"` in such a situation, too.

Detected as bug in #2150
X-ref:
- openPMD/openPMD-api#1087
- openPMD/openPMD-api#979
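The fix described above boils down to a guard against an empty version string before it is written as an HDF5 attribute. A minimal sketch of that logic (hypothetical function name; WarpX implements this in C++ in `WarpX::Version()`):

```python
def warpx_version(git_version_macro: str) -> str:
    """Return a non-empty version string.

    In some build paths (e.g. CI/run_test.sh with GNUmake) the version
    macro is set but empty; an empty string is problematic as an HDF5
    attribute and confusing anyway, so fall back to "Unknown".
    """
    return git_version_macro if git_version_macro else "Unknown"
```

Any non-empty macro value passes through unchanged; only the set-but-empty case is rewritten.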
@ax3l ax3l force-pushed the topic-openPMD-0.14.0 branch from 3c648bc to df805d7 Compare August 13, 2021 15:43
@ax3l ax3l changed the title [WIP] openPMD-api: 0.14.1 [WIP] openPMD-api: 0.14.2 Aug 13, 2021
@ax3l ax3l force-pushed the topic-openPMD-0.14.0 branch from df805d7 to 630d7ba Compare August 18, 2021 04:17
This updates our requirements to openPMD-api 0.14.2+

Among others, this release introduced resizable data sets, which we
will soon use for particle backtransformed diagnostics.
@ax3l ax3l force-pushed the topic-openPMD-0.14.0 branch from 630d7ba to e5930e9 Compare August 18, 2021 04:23
@ax3l ax3l changed the title [WIP] openPMD-api: 0.14.2 openPMD-api: 0.14.2 Aug 18, 2021
@RemiLehe RemiLehe merged commit f6064ec into ECP-WarpX:development Aug 18, 2021
@ax3l ax3l deleted the topic-openPMD-0.14.0 branch August 18, 2021 16:50
roelof-groenewald added a commit to ModernElectron/WarpX that referenced this pull request Aug 25, 2021
* AMReX/PICSAR: Weekly Update (ECP-WarpX#2199)

Weekly update to latest AMReX.
No changes in PICSAR.

* do_pml should not be parsed anymore. (ECP-WarpX#2183)

* do_pml not parsed. remove code that was added to support both types of boundary interface

* add parentheses

* fix eol and move to BackwardCompatibility

* missing semicolon

* Update Source/WarpX.cpp

* clean input files in examples

* delete do_pml in performance test input

* fixing an example input file in docs

* Update Source/WarpX.cpp

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* cleaning up docs

* Update Docs/source/usage/parameters.rst

* fix eol

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Multi-J (Hybrid): Fix Bug with Current Centering (ECP-WarpX#2181)

* Use less guard cells in ParallelCopy of refined level's UpdateAuxilaryData (ECP-WarpX#2144)

* Use less guard cells in ParallelCopy of refined level's UpdateAuxilaryData
* Update Source/Parallelization/WarpXComm.cpp
* Add inline comments and use explicit type for ng_src

Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>

* Update Scripts: .ini file (ECP-WarpX#2191)

Update regression tester .ini files as well in weekly/release
updates of AMReX and PICSAR.

* Minor fix to the documentation of the plasma lens (ECP-WarpX#2200)

* CI: Cancel Prev. GH Action on Push (ECP-WarpX#2202)

Save CI resources by canceling already running or waiting builds
if a PR is updated.

This was previously only done for Azure and now also for GH actions.

* openPMD-api: 0.14.2 (ECP-WarpX#2150)

This updates our requirements to openPMD-api 0.14.2+

Among others, this release introduced resizable data sets, which we
will soon use for particle backtransformed diagnostics.

* update use-sensei flag (ECP-WarpX#2192)

* update use-sensei flag
also add FlushFormatSensei.cpp to CMakeLists

* CMake: WarpX SENSEI Option

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Rho Diags: Do Not Allocate `rho_fp/cp`, Remove `plot_rho`, `setplot_rho` (ECP-WarpX#2203)

* Bug fixed: looping over all species now within `OneStep_multiJ` (ECP-WarpX#2207)

* Bug fixed: looping over all species now within 'OneStep_multij'

* Bug fixed: looping over all laser particles within DepositCharge in multi-J

* Implemented the parsing of integer input parameters (ECP-WarpX#2138)

* Implemented the integer parser

* Updated comment

* Updated documentation

* Fixed unused parameters

* Added some additional documentation

* Reworked the implementation so that expressions are evaluated as real and rounded to the nearest integer

* Fixed loop type

* Copied over initial value of variable to the real instance

* Update Source/Utils/WarpXUtil.cpp

make result const

* Update Source/Utils/WarpXUtil.cpp

make result const

* Update Source/Utils/WarpXUtil.cpp

Fix comment

* Added safeCastToInt

* Fixed adding of safeCastToInt

* Cleaned up safe casting routine

* Added parsing of more integer inputs

* Cleaned up the integer parser, removing unneeded cast from int to real

* Made x a const in safeCastToInt

Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Ionization.H: Fix Bug (Division by Zero) (ECP-WarpX#2214)

* Add plot_raw_rho in parameters doc (ECP-WarpX#2212)

* add plot_raw_rho in parameters doc

* Update Docs/source/usage/parameters.rst

Co-authored-by: Luca Fedeli <luca.fedeli@for.unipi.it>

Co-authored-by: Luca Fedeli <luca.fedeli@for.unipi.it>

* Docs: Summit with RHEL8 (ECP-WarpX#2216)

Summit has undergone a major software update to RHEL8.
The default compilers and CUDA version have been modernized, among
others providing C++17 support by default.

Also, our scientific I/O stack is now system-provided, thanks to our
Spack and E4S efforts 🎉

Please update your `warpx.profile` on Summit, re-build your Python
virtual environment and re-compile your executables.

* Improved error handling when the libwarpx shared object library can't be loaded (ECP-WarpX#2215)

* Improved error handling when the libwarpx shared library can't be loaded

* Removed extra newline

* Improved check and ended program on error

* Particle boundary scrape (ECP-WarpX#2124)

* Some prelimary refactoring.

* missing header

* implement scraping particles that leave the domain boundaries into buffers

* fix tabs

* missing return

* merging

* remove redefinition

* functor to work around cuda bug.

* handle 2D

* Add support for EB buffer

* protect for AMREX_USE_EB, static_assert that EB and RZ aren't both on.

* fix unused

* add inputs file

* add test

* fix bugs, remove print

* fix test

* fix test path.

* remove no-op code

* adding clear particles method

* attempt at adding time stamp

* Use integer step number instead of physical time to timestamp particles; also put shared code into named functor.

* move call to before apply boundary conditions

* use more descriptive inputs parameter

* Update Source/Particles/ParticleBoundaryBuffer.cpp

* fix comp bug

* move CopyAndTimestamp to cpp file

* also move IsOutsideProblemDomain functor

* Rename to m_particle_boundary_buffer

- Name: currently only used for boundary scraping
- Singular: only one instance

* Fix missing EOF newline

* Typo: author

* Param Read / Init: Cleanup For

Simplify

Co-authored-by: Roelof Groenewald <40245517+roelof-groenewald@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Updated the installation instructions (ECP-WarpX#2218)

* Docs: OLCF ADIOS2 Currently Broken

Tickets open:
- OLCFHELP-3319
- ornladios/ADIOS2#2836

* Allow Silver-Mueller boundary conditions to only be applied in certain directions (ECP-WarpX#2220)

* Apply Silver-Mueller boundary only in requested direction

* Include a test with independent Silver-Mueller boundary conditions

* Correct typo in implementation

* Added transform of fields from lab to boosted frame (ECP-WarpX#2201)

* Evolve: Reorder py_afterstep and cur_time break (ECP-WarpX#2213)

* Docs: Cori PICMI Instructions (ECP-WarpX#2219)

* Docs: Cori PICMI Instructions

Too tricky to get right to just cross-link - add documentation how to
build & run PICMI interfaces on Cori.

* Link: Jupyter Instructions

* No particle resorting when no species (ECP-WarpX#2136)

* default sort interval for particles if there are no species

* declare parmparse

Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>

* fix eol

* define and set sort_interval default to -1 and then reset them to 4 for GPU, -1 for CPU if there are species/lasers in the input

Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>

* AMReX/PICSAR: Weekly Update (ECP-WarpX#2222)

Weekly update to latest AMReX.
No changes in PICSAR.

* Scalar field interpolator from grid to particle position (ECP-WarpX#2221)

* refactored distance to EB calculation to have a more general function to interpolate from a scalar field on the grid to a given position

* changed RZ error message to general interpolation function

* changed function names to specify that scalar field interpolation is for a nodal field only

* Check for unused WarpX environment variables when compiling (ECP-WarpX#2208)

* Added check for unused 'WarpX' environment variables

* Changed the approach of this

- Go through environment variables as they are used to set cmake flags, and then
  check any remaining ones that start with warpx.

* Improved the warning message

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Ascent/SENSEI: Add Profiler (ECP-WarpX#2204)

* Ascent/SENSEI: Add Profiler

Add overall write profilers to Ascent & SENSEI.
Add detailed profile for sub-operations (blueprint, publish, execute)
to Ascent as well.

* Simplify One-Time Profiler Vars

* Enable restart with PSATD (ECP-WarpX#1367)

* Enable restart with PSATD

* Add new restart test

* Add new input file

* New CI Test: Fix Inputs, Fix Analysis Script

* Reuse input for Restart with FDTD

* Read time_of_last_gal_shift from Checkpoint

* Upload Benchmark for restart_psatd

* Update Benchmark for restart_psatd

Co-authored-by: Edoardo Zoni <ezoni@lbl.gov>

* Use the function `SyncRho` and the array `rho_fp` in the Electrostatic solver (ECP-WarpX#1811)

* Use `SyncRho` in Electrostatic solver

* Reuse rho_fp arrays

* Allocate arrays for rho

* Fix unused variable

* Fix bug in rho deposition

* Only use rho_fp in lab-frame Poisson solver

* Update test cases

* Incorporate PR comments

* Avoid an NaN in collision module (ECP-WarpX#2225)

* Update inputs_3d

* Update inputs_2d

* Update inputs_2d

* Update inputs_3d

* fix nan

* removed check for final timestep diagnostics in CI tests, should revert once installafterdiagnostics is implemented

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Revathi  Jambunathan <41089244+RevathiJambunathan@users.noreply.github.com>
Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>
Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>
Co-authored-by: Remi Lehe <remi.lehe@normalesup.org>
Co-authored-by: Corey Wetterer-Nelson <78513275+c-wetterer-nelson@users.noreply.github.com>
Co-authored-by: Olga Shapoval <30510597+oshapoval@users.noreply.github.com>
Co-authored-by: David Grote <grote1@llnl.gov>
Co-authored-by: Luca Fedeli <luca.fedeli@for.unipi.it>
Co-authored-by: Michael Kieburtz <michaelkieburtz@gmail.com>
Co-authored-by: Andrew Myers <atmyers@lbl.gov>
Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>
Co-authored-by: Edoardo Zoni <ezoni@lbl.gov>
Co-authored-by: Yinjian Zhao <yinjianzhao@lbl.gov>
ax3l added a commit to ax3l/WarpX that referenced this pull request Nov 12, 2021
Automatically copy and compile openPMD-api 0.14.3, but still
supporting the 0.14.2+ range (ECP-WarpX#2150).

The 0.14.3 release solves ABI incompatibilities in C++14/17 mixed
builds, among other issues (mainly read).
@ax3l ax3l mentioned this pull request Nov 12, 2021
EZoni pushed a commit that referenced this pull request Nov 12, 2021
Automatically copy and compile openPMD-api 0.14.3, but still supporting the 0.14.2+ range (#2150).

The 0.14.3 release solves ABI incompatibilities in C++14/17 mixed builds, among other issues (mainly read).
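The "0.14.2+ range" compatibility policy above can be sketched as a simple tuple comparison on the release number (a hypothetical helper; the real check lives in WarpX's build configuration, not in Python):

```python
def in_supported_range(version: str, minimum=(0, 14, 2)) -> bool:
    """Accept any openPMD-api release at or above the minimum,
    e.g. both the bundled 0.14.3 and an externally installed 0.14.2.
    """
    parts = tuple(int(p) for p in version.split("."))
    return parts >= minimum
```

Tuple comparison orders component-wise, so 0.14.3 and 0.15.0 pass while 0.14.1 does not.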
roelof-groenewald added a commit to ModernElectron/WarpX that referenced this pull request Nov 19, 2021
* Add Python Wrappers for F,G in PML (ECP-WarpX#2481)

* Add Python Wrappers for F,G in PML

* Add Getters for F,G Nodal Flags

* Fix Bug in <F,G>FPPMLWrapper (Default Level)

* Fix Bug in F,G Nodal Flags

* Use GetPML Method for F,G Nodal Flags

* PICMI: Add max_grid_size, blocking_factor in (x,y,z) (ECP-WarpX#2524)

* PICMI: Add amr.max_grid_size_<x,y,z>

* Set All Flags in Python

* PICMI: Add amr.blocking_factor_<x,y,z>

* added wrappers to get particle structs for the particles in the boundary buffers (ECP-WarpX#2498)

* Doxygen: Fix Docs (ECP-WarpX#2526)

* Fix Bug with Tilebox for G in PML (ECP-WarpX#2527)

* Fix Bug with Tilebox for G in PML

* Reset Benchmark

* AMReX/PICSAR: Weekly Update (ECP-WarpX#2533)

Weekly update to latest AMReX.
Weekly update to latest PICSAR (no changes).

```
./Tools/Release/updatePICSAR.py
./Tools/Release/updateAMReX.py
```

* Install pre-commit (ECP-WarpX#2532)

* Add pre-commit

Add basis for automated pre-commit checks.
Install locally via:

```bash
python3 -m pip install -U pre-commit
pre-commit install
```

See: https://pre-commit.com

* Cleanup: Whitespaces

* Cleanup: requirements.txt order

* Summit: Update Numpy Hints (ECP-WarpX#2535)

Make sure `numpy` can be rebuilt when and where needed.
To achieve that, move numpy-specific installation hints on OpenBLAS
to the WarpX profile.

* Fix some issues with Fujitsu compiler (ECP-WarpX#2529)

* make some code compilable with Fujitsu compiler in clang mode

* update documentation

* Fix ECP-WarpX#2522: Gaussian beam positions do not change with warpx.random_seed (ECP-WarpX#2523)

* Draw Gaussian beam position with amrex random engine

* Update benchmarks

* Update tolerance in space-charge tests

* Update benchmark for space charge initialization test

* Update benchmarks

* Update benchmark

* Clean-up code

* Update benchmarks

* ECP-WarpX#2534: Don't access position vector values beyond the configured dimension (ECP-WarpX#2536)

* ECP-WarpX#2534: Don't access position vector values beyond the configured dimension

* Fix particle position component used in XZ configuration

* Handle 1D case

* Move values only used in scraping function into inside-boundary condition

* Error out if scraping from EB in RZ

* Spack Development: macOS & GNUmake (ECP-WarpX#2545)

- Add macOS hints for OpenMP
- Add hints for running GNUmake regression tests locally

* Regression Tests: OMP on (ECP-WarpX#2548)

We generally run with only one OpenMP thread at the moment, but
disabling OpenMP altogether causes an extra compile, which slows down
CI.

* Tests: numthreads to 1 (ECP-WarpX#2546)

* Tests: numthreads to 1

We already hack this option to read `numthreads = 1` for
benchmarks, thus we now remove the other, confusing values.

* Prepare for CI: Do not Overwrite `numthreads`

* std::ifstream: Defensive Patterns (ECP-WarpX#2547)

Add failure handling if inputs in `std::ifstream`s cannot be opened
or have problems seek-ing through them.

This should catch I/O errors early.

* openPMD: 0.14.3 (ECP-WarpX#2551)

Automatically copy and compile openPMD-api 0.14.3, but still supporting the 0.14.2+ range (ECP-WarpX#2150).

The 0.14.3 release solves ABI incompatibilities in C++14/17 mixed builds, among other issues (mainly read).

* Add Ar and Xe to pre-defined particle types. (ECP-WarpX#2549)

* Added Ar and Xe to pre-defined particle types

* Added Boltzmann's constant to warpx parser

* Updated documentation

* 2D EM solver with EB (ECP-WarpX#2401)

* adding the FieldProbe

* adding missing file

* updating makefile

* fixing host-device problem

* Revert "fixing host-device problem"

This reverts commit 801e6fc.

* fixing host-device problem

* making some variables const

* adding a few comments

* Adding the FieldProbe to the documentation

* making the probe mpi-safe

* added field probe to reduced diag test

* added field probe to reduced diag analysis

* using cell-centered fields in probe diag

* removed a few typos

* Interpolating to the point instead of the cell center

* bug fix

* improved a comment

* updated documentation

* Undone an outdated change

* improving some variable names

* improving the box extraction

* making the interpolation order an input parameter

* fix a typo

* setting the field values to zero if the point is not in the domain

* skipping the communication if the probe proc is the IO processor

* Fixed typo in documentation

Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>

* Updating a header

* Added a comment on the probe position

* tidying up the analysis script

* fixed a comment

* removing an unused include

* improving the parsing of parameters

* fixing some comments

* making some variables const

* changed some ParticleReal into Real

* using better tags in MPI communication

* Making field probe work in 2D

* making a variable const

* initializing y_probe only in 3D

* tidying up a line which is common to 2D and 3D

* making a variable constexpr

* adding a _rt

* checking that the probe location is in one of the processors

* removing a useless if condition

* Fixing the initialization in 2D

* Avoiding scrape particles in 2D (it segfaults)

* Adding a test for 2D EB

* Fixed the areas initialization

* Initializing to zero some multifabs

* Modified the ECT solver to make it work in 2D

* Modified the cell extensions to make them work in 2D

* Improved 2D cube test

* Added 2D rotated cube test

* Adding the 2d analysis script and CI

* Removed an unused import from the analysis script

* Ignoring some unused variables

* Fixing the number of dimensions in the 2d test

* Added missing analysis for ECT

* Enabled again 2d particles scraping

* Fixing the test_name with the general logic

* Fixing the test_name with the general logic

* Removed some commented code

* Modified several preprocessor directives to check consistency EB-dimension

* Added missing semicolons

* Fixed a preprocessor directive

* Fix typo: WARPX_DIM_XZ

* Improving some comments

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Adding some more consistency checks

* Adding some more consistency checks

* Fixed a typo

Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Fix Instability in PML with PSATD (ECP-WarpX#2558)

* Fix Instability in PML with PSATD

Damping in the PML should be applied before the communications between the regular grids and the PML, and between PML grids, take place; otherwise the ghost cells are filled with lagged information, which results in an instability. Closes ECP-WarpX#2525.

* Update checksum of the pml_psatd_dive_divb_cleaning test

* Bugfix in load balancing routine (ECP-WarpX#2555)

* add remake of phi_fp during load balancing RemakeLevel

* added phi_cp remake to RemakeLevel function

* revert changes from previous commit

* I/O performance hints for Summit (ECP-WarpX#2495)

* Fix conflict with upstream

* Apply suggestions from code review

* Remove space in the end of lines

* Include suggestions from PR review

* Generalize ROMIO Hints in Batch Scripts

* Fix Comment

* Fix Comment

* Remove duplication

* Formatting

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* AMReX/PICSAR: Weekly Update (ECP-WarpX#2559)

* AMReX: Weekly Update

* PICSAR: Weekly Update

* Docs: Fix .rst Label in PML, rm .tex (ECP-WarpX#2537)

Fix an auto-converted label in a `.rst` file for the manual.
Remove the `PML.tex` file.

* Add 2D circle EB test (ECP-WarpX#2538)

* Added embedded_circle test

* Add embedded_circle test files

* Removed diag files

* removed PICMI input file

* Update to use default regression analysis

* Added line breaks for spacing

* Added description

* Fixed benchmark file

* Added load balancing to test

* Commented out load_balancing portion of test.
This will be added back in once load balancing is fixed.

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Fixes to the EB init (ECP-WarpX#2565)

* Avoid code duplications in ECT face extension (ECP-WarpX#2557)

* Refactoring the nborrow functions

* Refactoring the one cell extension

* Refactoring the eight cells extension

* Enabling 2D

* Bug fix

* Some more improvements

* Fixing templates

* Switching the order of templates and AMREX_GPU_DEVICE

* Adding the needed AMREX_GPU_DEVICE in WarpX.H

* Fixing GPU related issues

* Fixed a for loop bound

* Making the new functions free

* Suggestion from review

* Suggestion from review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Suggestion from review

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Improve loops over dimensions for 2D

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Enhanced inline documentation of EB related data (ECP-WarpX#2562)

* Enhanced inline documentation of EB related data

* Added ECT to the glossary

* Made the EB documentation doxygen-compatible

* Clean up input files for tests with MCC (ECP-WarpX#2552)

* Added embedded_circle test

* Add embedded_circle test files

* Removed diag files

* removed PICMI input file

* Update to use default regression analysis

* Added line breaks for spacing

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Added description

* Added Ar and Xe to pre-defined particle types

* added Boltzmann's constant to pre-defined constants and cleaned up the MCC CI test input

* Added Boltzmann's constant to warpx parser

* cleaned up embedded circle CI test input

* Remove duplicate entry.

Co-authored-by: kzhu-ME <kevin.zhu@modernelectron.com>
Co-authored-by: Kevin Z. Zhu <86268612+KZhu-ME@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Cell Center Macroscopic Properties (ECP-WarpX#2530)

* Cell Center Macroscopic Properties

* Commit Suggestions from PR Review

* Fix Error for 2D

Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>

* added superLU solver example to docs (ECP-WarpX#2567)

* Fix: GNUmake Python Link -g (ECP-WarpX#2568)

On CPU links of the GNUmake Python lib, we forgot our `-g`, which we
add to all build and optimization types. This is part of the
`LINKFLAGS` variable.

* Apply PEC to Split PML Fields (ECP-WarpX#2541)

* WarpXMovingWindow.cpp: Add `amrex::` Prefix (ECP-WarpX#2579)

* Bug fixes and cleanup in load balancing (ECP-WarpX#2563)

* added helper function to rebuild MultiFabs and iMultiFabs during load balancing and included rebuilding of EB multifabs

* added redistribute call for the particle boundary buffer during load balancing

* consistently use DistributionMap rather than dmap in ElectrostaticSolver.cpp

* applied suggested changes from code review by Phil Miller

* removed default argument for redistribute in RemakeMultiFab

* removed RemakeMultiFab() as a member of WarpX

* Only remake EB multifabs if they are used

Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>

* adapted existing particle scraping test (PICMI version) to also cover the redistribution of particle buffers from load balancing

* added redeclaring of m_borrowing

* Move redeclaring of m_borrow inside if statement for ECT solver algorithm

Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>

* added calls to MarkCells and ComputeFaceExtensions

* fixed issue causing CI test to fail and copied conditionals from WarpXInitData.cpp to recompute EB quantities

* Guard cells communication for EB data when re-gridding (#105)

* Add 2D circle EB test (ECP-WarpX#2538)

* Added embedded_circle test

* Add embedded_circle test files

* Removed diag files

* removed PICMI input file

* Update to use default regression analysis

* Added line breaks for spacing

* Added description

* Fixed benchmark file

* Added load balancing to test

* Commented out load_balancing portion of test.
This will be added back in once load balancing is fixed.

Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* Added guard cells communication for EB data in regridding

Co-authored-by: Kevin Z. Zhu <86268612+KZhu-ME@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>

* moved all EB grid data calculations to a new function InitializeEBGridData() which is now called by both WarpX::InitLevelData and WarpX::RemakeLevel

* Fix typo in doc string.

Co-authored-by: Phil Miller <unmobile+gh@gmail.com>

Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>
Co-authored-by: Kevin Z. Zhu <86268612+KZhu-ME@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Phil Miller <unmobile+gh@gmail.com>

* WarpXComm.cpp: Use Explicit Types, not `auto` (ECP-WarpX#2578)

* WarpXComm.cpp: Use Explicit Types, not `auto`

* Use MultiFab* const instead of MultiFab* const&

* Add WARPX_PROFILE calls to each python callback. (ECP-WarpX#2573)

When python callbacks take some time, this is useful as otherwise many
callbacks are lumped together in WarpX::Evolve::step.

* LaserInjectionFromTXYEFile Test: Use MPI (ECP-WarpX#2577)

* WarpX tests: All MPI

Enable MPI for the one regression test that does not use it.
Still uses one rank there.

This saves a compile per CI run.

* add numprocs

* Bug fix in postprocessing yt data and other small changes (#104)

* fixed bug in post-processing of field diagnostic yt data

* fixed issue causing post processing of field diagnostics unit test to fail

* added helper function to rebuild MultiFabs and iMultiFabs during load balancing and included rebuilding of EB multifabs

* added redistribute call for the particle boundary buffer during load balancing

* consistently use DistributionMap rather than dmap in ElectrostaticSolver.cpp

* applied suggested changes from code review by Phil Miller

* removed default argument for redistribute in RemakeMultiFab

* added load balance intervals as an optional input parameter to diode_setup.py

* removed RemakeMultiFab() as a member of WarpX

* Only remake EB multifabs if they are used

Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>

* changed plot_contours grid setting parameter from b to visible in accordance with matplotlib 3.5 changes

* Revert change to WarpXRegrid.cpp

* Suggested change from PS during code review

Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>

Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>
Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>

Co-authored-by: Edoardo Zoni <59625522+EZoni@users.noreply.github.com>
Co-authored-by: Axel Huebl <axel.huebl@plasma.ninja>
Co-authored-by: Luca Fedeli <luca.fedeli@cea.fr>
Co-authored-by: Remi Lehe <remi.lehe@normalesup.org>
Co-authored-by: Phil Miller <phil@intensecomputing.com>
Co-authored-by: Kevin Z. Zhu <86268612+KZhu-ME@users.noreply.github.com>
Co-authored-by: Lorenzo Giacomel <47607756+lgiacome@users.noreply.github.com>
Co-authored-by: Neïl Zaim <49716072+NeilZaim@users.noreply.github.com>
Co-authored-by: Weiqun Zhang <WeiqunZhang@lbl.gov>
Co-authored-by: Jean Luca Bez <jeanlucabez@gmail.com>
Co-authored-by: kzhu-ME <kevin.zhu@modernelectron.com>
Co-authored-by: Revathi  Jambunathan <41089244+RevathiJambunathan@users.noreply.github.com>
Co-authored-by: Phil Miller <unmobile+gh@gmail.com>
Co-authored-by: Peter Scherpelz <31747262+peterscherpelz@users.noreply.github.com>
dpgrote pushed a commit to dpgrote/WarpX that referenced this pull request Nov 29, 2021
Automatically copy and compile openPMD-api 0.14.3, but still supporting the 0.14.2+ range (ECP-WarpX#2150).

The 0.14.3 release solves ABI incompatibilities in C++14/17 mixed builds, among other issues (mainly read).