Add support for PETSc solvers #659

Closed

Conversation

guyer
Member

@guyer guyer commented Jul 3, 2019

Addresses #387

TODO:

  • test in parallel on Py3k
  • revise test matrix
  • document scaling
    This situation is unclear and needs more analysis.
  • verify threads vs. cpus
    This situation is unclear and needs more analysis.
  • disable electrochem tests
  • sort out precision issues
  • nproc > 2
    Failures in
    • Doctest: fipy.meshes.gmshMesh.Gmsh2D
    • Doctest: examples.diffusion.nthOrder.input4thOrder1D
    • Doctest: fipy.meshes.representations.gridRepresentation._GridRepresentation._test
    • Doctest: examples.levelSet.distanceFunction.square (due to "skfmm raises exception on empty mesh" #684, which is beyond the scope of this PR)
  • revise "Running Under Python 3"
  • update fipy-feedstock
  • update environment.yml/requirements.txt (where are these?!? Ah, in binder/.)

@guyer guyer changed the title from "Ticket644 add support for PETSc solvers" to "WIP: Ticket644 add support for PETSc solvers" Jul 3, 2019
Serial only.
Must stipulate bandwidth in all uses.

Addresses usnistgov#644
There was a lot of redundant code between the specified-solver cases and
the guessed-solver cases. These treatments have been consolidated and
simplified, making it easier to rearrange the order of solvers to try and
to add new solvers.
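
As an illustration of that consolidation, a hypothetical sketch (the package names and helper below are illustrative, not FiPy's actual selection code):

    # Hypothetical sketch: try candidate solver packages from a single,
    # easily reordered list instead of duplicating the logic for the
    # "solver was specified" and "solver must be guessed" cases.
    import importlib

    def _first_importable(candidates):
        """Return the first solver package that imports cleanly."""
        for name in candidates:
            try:
                return importlib.import_module(name)
            except ImportError:
                continue
        raise ImportError("no solver package found among %r" % (candidates,))

    # Reordering or extending this list is the only change needed to
    # adjust which solver suite is preferred.
    solvers = _first_importable(["petsc4py", "PyTrilinos", "pysparse", "scipy"])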

Addresses usnistgov#644
Old `parallelComm` made heavy assumptions about availability of Trilinos.
Comms (particularly the parallel ones) are intimately tied to the solver
in use, so they should be instantiated with the solver.

No matter what, we need mpi4py, so use it (and only it) to determine the
number of processes in order to pick a solver.
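
A minimal sketch of that selection step, assuming only mpi4py is available (the package names below are illustrative):

    # Use mpi4py alone to count MPI processes; a multi-process run needs a
    # solver suite with distributed matrices, a single-process run does not.
    from mpi4py import MPI

    n_proc = MPI.COMM_WORLD.Get_size()

    if n_proc > 1:
        solver_package = "petsc"      # or "trilinos", if installed
    else:
        solver_package = "scipy"      # any serial suite will do

    print("processes:", n_proc, "-> solver package:", solver_package)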

Addresses usnistgov#644
Based on trilinos' comms

Addresses usnistgov#644
PETSc seems pickier about matrices with nothing on the diagonal than the
other solvers. This `DummySolver` is an attempt to work around that,
although I'm not clear what the `DummySolver` is attempting to resolve in
the first place.
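
For illustration, assuming the issue is PETSc factorizations rejecting matrices whose diagonal entries are structurally missing, the workaround amounts to something like the following petsc4py sketch (not the actual `DummySolver` code):

    # Insert explicit zeros on the diagonal so every diagonal entry exists
    # in the assembled sparsity pattern, even if its value is zero.
    from petsc4py import PETSc

    n = 5
    A = PETSc.Mat().createAIJ([n, n])
    A.setUp()

    for i in range(n):
        A.setValue(i, i, 0.0)

    A.assemble()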

Addresses usnistgov#644
Not used and no longer importable at this stage.

Addresses usnistgov#644
Matrices and vectors have already been converted by the time we get to
`_solve_()`.

Addresses usnistgov#644
There is no point in passing the iterations and tolerance to
`PETSc.KSP()`; it is only used to precondition. We need to handle
iterations and tolerance ourselves, just like for the other LU solvers.
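
A minimal petsc4py sketch of that configuration (a toy 3×3 system; not FiPy's `LinearLUSolver` verbatim):

    from petsc4py import PETSc

    n = 3
    A = PETSc.Mat().createAIJ([n, n])
    A.setUp()
    for i in range(n):
        A.setValue(i, i, 2.0)
    A.assemble()

    b = A.createVecLeft()
    b.set(1.0)
    x = A.createVecRight()

    # "preonly" applies the preconditioner exactly once, so any iteration
    # count or tolerance handed to the KSP is never consulted; the LU
    # factorization lives entirely in the preconditioner.
    ksp = PETSc.KSP().create()
    ksp.setType("preonly")
    ksp.getPC().setType("lu")
    ksp.setOperators(A)
    ksp.solve(b, x)

    # Residual/tolerance bookkeeping is up to the caller, as with the
    # other LU solvers.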

Addresses usnistgov#644
Incomplete transition to parallel. Works for some
(`examples/diffusion/mesh1D.py`, `examples/diffusion/mesh20x20.py`), but
not others (`examples/diffusion/circle.py`).

Addresses usnistgov#644
PETSc's LinearLUSolver presently doesn't converge, causing this example to
run forever.

Addresses usnistgov#644
Matrix multiplication did not properly deal with parallel PETSc vectors
or with 32-bit indices.
All (expected) tests pass.
numerix routines are no longer imported globally. Horrible construction of
chemotaxis tests meant we didn't catch these two.
More instructive display of parallel conditions, including errors
The need for a different `__mul__` for `_PETScMatrix` and
`_PETScMeshMatrix` got lost in the mix. This is an attempt to get the
right differentiation, based loosely on TrilinosMatrix.
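
A loose sketch of the intended `__mul__` split (class names carry a `Sketch` suffix because these are illustrations, not the actual FiPy wrappers):

    import numpy as np

    class _PETScMatrixSketch(object):
        """Bare matrix wrapper: knows only scalars and other wrapped matrices."""
        def __init__(self, matrix):
            self.matrix = matrix                   # a petsc4py PETSc.Mat

        def __mul__(self, other):
            if np.isscalar(other):
                result = self.matrix.copy()
                result.scale(other)                # scalar * matrix
                return type(self)(result)
            return type(self)(self.matrix.matMult(other.matrix))  # matrix * matrix

    class _PETScMeshMatrixSketch(_PETScMatrixSketch):
        """Mesh-aware wrapper: additionally accepts mesh-sized NumPy vectors."""
        def __mul__(self, other):
            if isinstance(other, np.ndarray):
                x = self.matrix.createVecRight()
                x.setArray(other)                  # copy values into the local part
                y = self.matrix.createVecLeft()
                self.matrix.mult(x, y)             # matrix * vector
                return y.getArray().copy()
            return super(_PETScMeshMatrixSketch, self).__mul__(other)
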
Note: PETSc Matrix market export is broken. Patch from
https://bitbucket.org/jeguyer/petsc/branch/fix-matrix-market-export
is required.
PETSc vectors don't support addition with NumPy arrays. Epetra vectors do
support it, but things also seem to work if everything is cast to NumPy
arrays.
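
A small sketch of the cast-to-NumPy approach in serial (so local and global values coincide):

    import numpy as np
    from petsc4py import PETSc

    v = PETSc.Vec().createSeq(4)
    v.set(1.0)

    a = np.arange(4, dtype=float)

    # Per the commit above, `v + a` is not supported for PETSc vectors
    # (Epetra vectors accept it), but pulling the local values out as a
    # NumPy array works everywhere.
    total = v.getArray() + a
    print(total)   # [1. 2. 3. 4.]
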
Ensure that result of a matrix * matrix multiplication doesn't downcast
@wd15
Contributor

wd15 commented Jan 22, 2020

I couldn't find any typos in the text you've written. All the links seemed to work as well.

The only thing is the graph.

I would make clear what the numbers mean in the legend. In the caption write something like "Scaling behaviour of different solver packages with both varying tasks and varying threads. The number of tasks (processes) varies along the x-axis and the number of threads varies via the line style (see legend)."

The bullet points are an explanation for the behavior in the graph, but you don't describe the behavior. For example, I think you should say "Trilinos with two threads is faster than with one thread until the number of tasks reaches 10. We don't fully understand the reason for this" etc. I find it hard to interpret graphs like that without a narrative.

Contributor

@wd15 wd15 left a comment

Fix graph narrative

Contributor

@tkphd tkphd left a comment

Still LGTM. Minor edits suggested.

@guyer
Member Author

guyer commented Jan 22, 2020

@wd15 & @tkphd: thanks. I've added the clarifications requested (within the constraints of the limitations of plot captions in sphinx)

@wd15
Contributor

wd15 commented Jan 22, 2020

Looks good now.

guyer added 13 commits January 24, 2020 09:33
Gmsh 4.0 doesn't partition correctly
Inexplicable error:

    convert: improper image header `/root/project/documentation/_build/latex/.doctrees/images/dae02450438e07c40e1e06ac9a26251fb6896047/project_thin_badge.gif' @ error/gif.c/ReadGIFImage/1028.
    convert: no images defined `/root/project/documentation/_build/latex/.doctrees/images/project_thin_badge.png' @ error/convert.c/ConvertImageCommand/3273.
Gmsh refinement seems to stall out
@guyer
Member Author

guyer commented Jan 27, 2020

Superseded by #701

@guyer guyer closed this Jan 27, 2020
@guyer guyer deleted the ticket644-Add_support_for_PETSc_solvers branch April 9, 2021 23:16