Add support for PETSc solvers [REBASE] #701

Merged: guyer merged 19 commits into usnistgov:master from guyer:Add_support_for_PETSc_solvers_REBASE on Jan 28, 2020
Conversation
- Explicitly install petsc4py: `create --only-deps` results in installing `petsc`, but not `petsc4py`
- Remove petsc from the AppVeyor matrix: petsc4py is not available from conda-forge
- Bump the CircleCI cache
- Differentiate Py2k and Py3k caches
- Expand parameters
- Remove command expansion from the cache name
- Checksum the correct file
- Remove checksum of a nonexistent file
- Install mpi4py
- Add PETSc serial runs
- Add PETSc parallel runs
- Force mpich
- Install libGL for Gmsh?
- Drop the mkl requirement on CircleCI
- Install, then remove, fipy instead of using `--deps-only`: `--deps-only` doesn't record the build-variant specifics of fipy that conda chose, making installation problems a nightmare to diagnose
Serial only. Bandwidth must be stipulated in all uses.

- Account for matrix ownership range
- Introduce an explicit `DummySolver` for PETSc: PETSc seems pickier than the other solvers about matrices with nothing on the diagonal. This `DummySolver` is an attempt to work around that, although I'm not clear what the `DummySolver` is attempting to resolve in the first place.
- Institute the LU iteration scheme used elsewhere: there is no point in passing the iterations and tolerance to `PETSc.KSP()`, which is only used to precondition. We need to handle iterations and tolerance ourselves, just as for the other LU solvers (see the sketch after this list).
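A minimal sketch of that LU iteration scheme in petsc4py, assuming `A` is a `PETSc.Mat`, `b` a `PETSc.Vec`, and `iterations` and `tolerance` are given; this is illustrative, not the PR's literal implementation:

```python
from petsc4py import PETSc

ksp = PETSc.KSP().create()
ksp.setType('preonly')        # no Krylov iterations; just apply the preconditioner
ksp.getPC().setType('lu')     # LU factorization acts as the "preconditioner"
ksp.setOperators(A)

x = A.createVecRight()
x.set(0.)
residual = b.copy()           # residual of the zero initial guess is b itself

# handle the iteration/tolerance loop ourselves, as for the other LU solvers
for _ in range(iterations):
    error = A.createVecRight()
    ksp.solve(residual, error)    # error ~= A^-1 @ residual
    x.axpy(1., error)             # x <- x + error
    A.mult(x, residual)           # residual <- A @ x
    residual.aypx(-1., b)         # residual <- b - A @ x
    if residual.norm() <= tolerance * b.norm():
        break
```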
There was a lot of redundant code between the case where a solver suite is specified and the case where one must be guessed. These treatments have been consolidated and simplified, making it easier to rearrange the order of solvers to try and to add new solvers. Addresses usnistgov#644
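As a rough, hypothetical sketch of that consolidation (the suite names and their ordering here are illustrative, not necessarily what FiPy uses):

```python
# try candidate solver suites in a single, easily reordered list
candidates = ["petsc", "trilinos", "pysparse", "scipy"]

suite = None
for name in candidates:
    try:
        suite = __import__("fipy.solvers." + name, fromlist=[name])
        break
    except ImportError:
        continue

if suite is None:
    raise ImportError("no solver suite could be imported")
```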
- The old `parallelComm` made heavy assumptions about the availability of Trilinos. Comms (particularly the parallel ones) are intimately tied to the solver in use, so they should be instantiated with the solver.
- No matter what, we need mpi4py, so use it (and only it) to determine the number of processes in order to pick a solver (see the sketch after this list).
- Create PETSc comms based on Trilinos' comms
- Make max/min global: change to less flippant names (and add a docstring)
- Force `SkewedGrid2D` to have a SerialComm
- Improve reporting of solver errors
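A minimal sketch of that process-count check, assuming only that mpi4py is importable (the suite preferences shown are illustrative):

```python
from mpi4py import MPI

# mpi4py is required regardless of suite, so it is the one dependable way
# to ask how many processes we were launched with
if MPI.COMM_WORLD.Get_size() > 1:
    preferred = ["petsc", "trilinos"]                      # suites that can solve in parallel
else:
    preferred = ["pysparse", "scipy", "petsc", "trilinos"]
```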
Incomplete transition to parallel. Works for some examples (`examples/diffusion/mesh1D.py`, `examples/diffusion/mesh20x20.py`), but not others (`examples/diffusion/circle.py`).

- Properly account for serial communicators
- Define `mpi4py_comm` for serial PETSc comms
- Calculate parallel residual
- Matrix multiplication did not properly deal with parallel PETSc vectors or with 32-bit indices
- The need for a different `__mul__` for `_PETScMatrix` and `_PETScMeshMatrix` got lost in the mix. This is an attempt to get the right differentiation, based loosely on TrilinosMatrix.
- Fix construction of higher-order b-vectors: PETSc vectors don't support addition with NumPy arrays. Epetra vectors do support it, but things also seem to work if everything is cast to NumPy arrays.
- Fix the type of matrix * matrix multiplication: ensure that the result of a matrix * matrix multiplication isn't downcast
- Cast PETSc vectors to NumPy arrays: PETSc vectors don't support all of the methods that NumPy arrays (and Epetra vectors) do (see the sketch after this list)
- Use the inherited `__mul__` for PETSc vectors, as well as matrices
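For instance, the cast-to-NumPy workaround might look like this sketch (the vector contents are made up):

```python
import numpy as np
from petsc4py import PETSc

v = PETSc.Vec().createSeq(5)       # a serial PETSc vector
v.set(1.)

# PETSc Vecs don't support addition with NumPy arrays the way Epetra
# vectors do, so copy the values out and do the arithmetic in NumPy
arr = np.array(v.getArray())
result = arr + np.arange(5.)
```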
- More instructive display of parallel conditions, including errors
- Pass integers to MaxAll
- Make `AbstractCommWrapper.MaxAll` act like `ParallelPETScCommWrapper.MaxAll` and `EpetraCommWrapper.MaxAll` (a rough sketch follows)
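A hedged sketch of what a consistent `MaxAll` could look like in terms of mpi4py; the wrapper classes named above are FiPy's, but this body is illustrative:

```python
from mpi4py import MPI

def MaxAll(value):
    """Return the maximum of `value` across all ranks (illustrative sketch)."""
    return MPI.COMM_WORLD.allreduce(int(value), op=MPI.MAX)
```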
Note: PETSc Matrix Market export is broken. The patch from https://bitbucket.org/jeguyer/petsc/branch/fix-matrix-market-export is required.
- Account for `_CoupledVariables`
- Return the residual as calculated, as a PETSc (ghosted) Vec
- Use the correct communicator: reflexively using WORLD_COMM causes all sorts of problems
- Convert between PETSc and FiPy vectors with ghosts: account for coupled/vector Variables and for moving ghosts to the end of PETSc vectors
- Skip reshape and insertion of an empty ghost array: NumPy doesn't (readily, anyway) let us reshape [] to [[]]
- Adjust shape for both vector and coupled
- Differentiate between ghost positions in FiPy Variables and in the PETSc matrix
- Change generic multiplication to account for rows, but not ghosts
- Make the multiplication result ghosted
- Flatten the PETSc vector and let the recipient decide if it's coupled
- Adjust the size of the ghost vector: `createGhostWithArray()` requires an array of size nlocal+nghost, while `createGhost()` requires supplying only nlocal as the size (see the sketch after this list)
- Account for ghosting of the residual vector
- Use `array` instead of `asarray`: `asarray` created a view that disappeared when the `localForm()` went out of context
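The ghost-vector sizing convention and the `array`-vs-`asarray` pitfall can be sketched as follows; the sizes and ghost indices are invented, and the example assumes a run with more than one MPI rank:

```python
import numpy as np
from petsc4py import PETSc

nlocal = 4
ghosts = np.array([7, 8], dtype=PETSc.IntType)   # global indices owned by other ranks

# createGhost() is sized by the locally owned entries only...
v = PETSc.Vec().createGhost(ghosts, size=(nlocal, None))

# ...while createGhostWithArray() needs backing storage for owned + ghost entries
backing = np.zeros(nlocal + len(ghosts))
w = PETSc.Vec().createGhostWithArray(ghosts, backing)

with v.localForm() as lf:
    # copy, not a view: an asarray() view would dangle once localForm() exits
    values = np.array(lf.getArray())
```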
…m end of PETSc vector

- Remove reshape for rank-0 vectors: `numerix.reshape([], (2, -1))` works, but `numerix.reshape([], (1, -1))` doesn't. Go figure.
- Convert to a dense NumPy array more effectively: convert via a Matrix Market file, just like Trilinos. `getValuesCSR()` only returns local values and `getDenseArray()` fails with "Cannot locate function MatDenseGetArray_C in object".
- Account for no application ordering on PETScMatrix
- Take the parallel diagonal using the existing method: account for diagonal ghost cells
- `PETSc.Viewer().createASCII()` no longer understands `PETSc.Viewer.Format.ASCII_MATRIXMARKET`. The new code is closer to `PetscBinaryIO.py`, which is not on the `PYTHONPATH`.
- Add `exportMmf` for SciPy, which it never had for some reason
- Consolidate CSR values from all nodes (a rough sketch follows)
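Consolidating the per-rank CSR pieces onto one rank before writing a Matrix Market file could look roughly like this; `A` is assumed to be a distributed `PETSc.Mat`, and this is not the PR's literal code:

```python
import numpy as np
from mpi4py import MPI
from scipy import sparse, io

comm = MPI.COMM_WORLD
indptr, indices, data = A.getValuesCSR()        # only the locally owned rows

pieces = comm.gather((indptr, indices, data), root=0)
if comm.rank == 0:
    ptr, idx, val, offset = [np.array([0])], [], [], 0
    for p_ptr, p_idx, p_val in pieces:
        ptr.append(p_ptr[1:] + offset)          # shift row pointers past earlier ranks' nonzeros
        idx.append(p_idx)
        val.append(p_val)
        offset += p_ptr[-1]
    full = sparse.csr_matrix((np.concatenate(val),
                              np.concatenate(idx),
                              np.concatenate(ptr)))
    io.mmwrite("matrix.mtx", full)
```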
- Fix Py3k syntax errors
- `futurize --stage2 --write --nobackups fipy` (there were no --stage1 fixes)
- `2to3 --doctests_only --write --nobackups fipy`
- Fix print statements
This is both clearer about what it is and consistent with the other meshes.
- Move solver-specific discussions to documentation/SOLVERS.rst
- Document PETSc solvers
- Document PETSc preconditioners
- Add a note that PyTrilinos isn't available on Py3k (from conda-forge)
- Document debugging in parallel
- Change to "Running under Python 2"
- Add usage details for PETSc, parallel, and Py2k
- Update the discussion of parallel scaling and threading
- Add glossary entries for commonly used external software
- Purge matforge
- Add spelling words
PETSc doesn't converge for nproc > 2 without better preconditioning. 'cholesky', 'asm', and 'svd' also work; I haven't tried them all.
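For reference, selecting one of those preconditioners in petsc4py is just a matter of setting the PC type; this snippet is illustrative, not the PR's actual configuration:

```python
from petsc4py import PETSc

ksp = PETSc.KSP().create()
ksp.setType('gmres')
ksp.getPC().setType('asm')    # additive Schwarz; 'cholesky' and 'svd' also reported to work
ksp.setTolerances(rtol=1e-10, max_it=1000)
```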
Addresses #387
Supersedes #659, rebased to clean up commit history