
gmshMesh.py compatibility with Gmsh > 3.0.6 #644

Merged 1 commit into usnistgov:develop on Apr 23, 2019
Conversation

@xfong (Contributor) commented Apr 23, 2019

Update gmshMesh.py for compatibility with Gmsh version > 3.0.6

Update gmshMesh.py for compatibility with Gmsh version > 3.0.6
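
As a rough illustration (not FiPy's actual implementation), a compatibility fix of this kind typically hinges on detecting which Gmsh is installed and branching on it; the binary name, parsing, and threshold below are assumptions:

```python
import subprocess

def gmsh_version(binary="gmsh"):
    """Return the installed Gmsh version as a tuple of ints, e.g. (4, 0, 6)."""
    # Gmsh reports its version when invoked with `--version`; depending on the
    # release it may write to stdout or stderr, so capture both.
    proc = subprocess.run([binary, "--version"],
                          stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                          universal_newlines=True)
    text = (proc.stdout or proc.stderr).strip()
    return tuple(int(part) for part in text.split("-")[0].split("."))

if gmsh_version() > (3, 0, 6):
    # Newer Gmsh releases changed their output conventions, so the
    # mesh-reading code would take the updated path here.
    pass
```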
@guyer self-requested a review April 23, 2019 18:00
@guyer added the meshes label Apr 23, 2019
@guyer (Member) left a comment

Thanks! I was just starting to diagnose this.

@guyer (Member) commented Apr 23, 2019

CircleCI didn't run for some reason. I'm merging anyway.

@guyer merged commit 4bf61f9 into usnistgov:develop on Apr 23, 2019
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
Serial only.
Must stipulate bandwidth in all uses.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
There was a lot of redundant code between the specified solver cases and
the guessing solver cases. These treatments have been consolidated and
simplified, making it easier to rearrange the order of solvers to try and
to add new solvers.

Addresses usnistgov#644
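
A minimal sketch of the consolidation described above, assuming hypothetical module paths and ordering (not FiPy's real registry): one ordered list of candidate solver suites serves both the explicitly specified case and the guessing case.

```python
import importlib

# Hypothetical ordering and module paths, for illustration only.
CANDIDATE_SUITES = ["petsc", "trilinos", "pysparse", "scipy"]

def pick_solver_suite(requested=None):
    # A requested suite is tried alone; otherwise fall through the same
    # loop over the default ordering, so both cases share one code path.
    candidates = [requested] if requested else CANDIDATE_SUITES
    for name in candidates:
        try:
            return importlib.import_module("fipy.solvers." + name)
        except ImportError:
            continue
    raise ImportError("no usable solver suite found among %r" % (candidates,))
```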
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
The old `parallelComm` made heavy assumptions about the availability of Trilinos.
Comms (particularly the parallel ones) are intimately tied to the solver
in use, so they should be instantiated with the solver.

No matter what, we need mpi4py, so use it (and only it) to determine the number
of processes in order to pick a solver.

Addresses usnistgov#644
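
A minimal sketch of that check, using only mpi4py to count ranks; the suite names returned here are placeholders, not FiPy's actual choices:

```python
from mpi4py import MPI

def preferred_suite():
    # Under a plain serial run this is 1; under `mpirun -np 4 ...` it is 4.
    nproc = MPI.COMM_WORLD.Get_size()
    if nproc > 1:
        return "trilinos"   # placeholder for a parallel-capable suite
    return "pysparse"       # placeholder for a serial-only suite
```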
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
Based on Trilinos' comms

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
PETSc seems pickier than the other solvers about matrices with nothing on
the diagonal. This `DummySolver` is an attempt to work around that,
although I'm not clear on what the `DummySolver` is attempting to resolve in
the first place.

Addresses usnistgov#644
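
One common workaround for PETSc's insistence that every diagonal entry exist in the sparsity pattern is to insert explicit zeros on the diagonal before assembly. This is a sketch of that idea with petsc4py, not necessarily what `DummySolver` does:

```python
from petsc4py import PETSc

n = 4
A = PETSc.Mat().createAIJ([n, n], nnz=3)
# ... add the real coefficients (also with ADD_VALUES) ...
for i in range(n):
    # An explicit (possibly zero) entry guarantees the diagonal position
    # exists, so an LU factorization has something to pivot on; ADD_VALUES
    # leaves any coefficient already contributed to (i, i) intact.
    A.setValue(i, i, 0.0, addv=PETSc.InsertMode.ADD_VALUES)
A.assemble()
```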
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
Not used and no longer importable at this stage.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
Matrices and vectors have already been converted by the time we get to
`_solve_()`.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
There is no point in passing the iterations and tolerance to
`PETSc.KSP()`; it is only used to precondition. We need to handle
iterations and tolerance ourselves, just like for the other LU solvers.

Addresses usnistgov#644
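
A hedged sketch of that bookkeeping with petsc4py, assuming `A`, `b`, `x`, and `tolerance` already exist; this shows the general pattern, not FiPy's exact code:

```python
from petsc4py import PETSc

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType('preonly')     # a single application of the preconditioner...
ksp.getPC().setType('lu')  # ...which here is a full LU factorization
ksp.solve(b, x)

# Judge convergence ourselves rather than relying on KSP's counters.
r = b.duplicate()
A.mult(x, r)          # r = A x
r.aypx(-1.0, b)       # r = b - A x
converged = r.norm() < tolerance
```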
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
Incomplete transition to parallel. Works for some
(`examples/diffusion/mesh1D.py`, `examples/diffusion/mesh20x20.py`), but
not others (`examples/diffusion/circle.py`).

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
guyer added a commit to guyer/fipy that referenced this pull request Jul 3, 2019
PETSc's LinearLUSolver presently doesn't converge, causing this example to
run forever.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
Serial only.
Must stipulate bandwidth in all uses.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
There was a lot of redundant code between the specified solver cases and
the guessing solver cases. These treatments have been consolidated and
simplified, making it easier to rearrange the order of solvers to try and
to add new solvers.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
The old `parallelComm` made heavy assumptions about the availability of Trilinos.
Comms (particularly the parallel ones) are intimately tied to the solver
in use, so they should be instantiated with the solver.

No matter what, we need mpi4py, so use it (and only it) to determine the number
of processes in order to pick a solver.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
Based on Trilinos' comms

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
PETSc seems pickier than the other solvers about matrices with nothing on
the diagonal. This `DummySolver` is an attempt to work around that,
although I'm not clear on what the `DummySolver` is attempting to resolve in
the first place.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
Not used and no longer importable at this stage.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
Matrices and vectors have already been converted by the time we get to
`_solve_()`.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
There is no point in passing the iterations and tolerance to
`PETSc.KSP()`; it is only used to precondition. We need to handle
iterations and tolerance ourselves, just like for the other LU solvers.

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
Incomplete transition to parallel. Works for some
(`examples/diffusion/mesh1D.py`, `examples/diffusion/mesh20x20.py`), but
not others (`examples/diffusion/circle.py`).

Addresses usnistgov#644
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
guyer added a commit to guyer/fipy that referenced this pull request Dec 16, 2019
PETSc's LinearLUSolver presently doesn't converge, causing this example to
run forever.

Addresses usnistgov#644
guyer added a commit to guyer/fipy-feedstock that referenced this pull request Jan 2, 2020
Thanks to usnistgov/fipy#644, newer versions of Gmsh should be fine
guyer added a commit to conda-forge/fipy-feedstock that referenced this pull request Jan 2, 2020
* Restore gmsh on all platforms

Thanks to usnistgov/fipy#644, newer versions of Gmsh should be fine

* Bump build number

* Drop gmsh on Win2k
guyer added a commit to guyer/fipy that referenced this pull request Jan 27, 2020
There was a lot of redundant code between the specified solver cases and
the guessing solver cases. These treatments have been consolidated and
simplified, making it easier to rearrange the order of solvers to try and
to add new solvers.

Addresses usnistgov#644