
Linear Solvers (for large sparse systems) #648

Closed
pcarruscag opened this issue Feb 5, 2019 · 5 comments

@pcarruscag
Member

The discussion in #643 revealed that the performance of the linear solvers is a concern for many of us.
I am opening this issue so we can discuss the topic separately; I will quote the relevant comments from #643 below.

@pcarruscag
Member Author

@bmunguia
@EduardoMolina and I have discussed this over the past few weeks and are also in favor of using an external library. I don't have a strong opinion on the library we choose, but he seems to be in favor of PETSc from ANL, which has a 2-clause BSD license and is used by ADflow (formerly SUmb), among other solvers. Eduardo could probably provide more details.

Another one that has come up in our discussions is HYPRE from LLNL, which has a GNU LGPL license.

@juanjosealonso
(...) While PETSc is a wonderful library (and parallel), I would hesitate to use it as the solution for the problem that we are trying to solve: it is not the easiest thing to compile and it is most definitely not lightweight. If one also wanted to replace the Krylov-space solvers and preconditioners in SU2, then PETSc might make more sense... but it still forces the developer to conform to their view of the world (including matrix setup and decomposition). (...)

@erangit
I also support the use of external libraries (no need to repeat the many advantages, as they are well described above), but I think we should be very wary of portability issues. For instance, in SUMB, PETSc was used for the Krylov solvers and more. While it indeed worked well and in parallel, each new implementation was a nightmare. The LAPACK/BLAS package, on the other hand, provides a much easier implementation experience. Certainly this is not the only consideration, but it should be taken into account. Currently, as a result of the significant contributions of the members of this developer group, the SU2 implementation works like a charm. I think we should strive to preserve this, especially if we aim to attract more users and developers into the community. (...)

@vdweide
(...) @bmunguia and @EduardoMolina, what type of application did you have in mind for PETSc? The only thing I can think of is a full Newton solver. And no matter how much I like PETSc, @juanjosealonso and @erangit have a point here. It looks like I am starting to belong to the group of old conservatives as well...

@economon
(...) If you really would like to give PETSc a shot, I recommend talking with @anilvar who had an interface for connecting it to SU2 in one of our branches.

@pcarruscag
(...) being able to use PETSc or HYPRE would be interesting as it would give us access to AMG, and @talbring's branch feature_template_linear_solver would make such an integration compatible with AD. (...)

@EduardoMolina
(...) When Brian (@bmunguia) and I mentioned PETSc, the idea was to try a different Newton-Krylov (with preconditioner) library in order to improve the convergence of SU2.
Since the slow convergence of the SU2 FV solver is the main feedback I have received from other users in industry and academia, I think it is worth trying an external library and evaluating its performance. (...)

@pcarruscag
(...) That is something I am also interested in as for some of my structural cases the current linear solvers simply do not converge. (...)

@economon
I would add one practical comment for consideration: it is worth checking whether the main restriction we have is related to approximations in the Jacobian that limit the effective CFL we can use or whether the convergence of the linear solver itself is a problem (speed or complete lack of convergence). A quick test without resorting to another library is to increase the fill-in for ILU-preconditioned GMRES, which is very expensive/slow but should converge difficult problems, and to check how high we can take the CFL when allowing each nonlinear iteration to converge to a tight tolerance in the linear solver, say 1e-14 (you can output the linear solver residuals to verify convergence). If we can take the CFL higher with a more performant linear solver, then it could be worth the effort to try other options.
If the CFL must remain low for stability, then perhaps we should look at the quality of the Jacobians we construct to see if we can improve, or even try exact Jacobians with AD if we can afford it. A more advanced CFL ramping strategy could also be helpful here to get us closer to a solution before trying to aggressively converge.

I think that is everyone.
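
For reference, a minimal config sketch of the stress test Tom describes above might look as follows. The option names follow SU2's config_template.cfg, but the specific values are only illustrative, and the ordering of CFL_ADAPT_PARAM should be checked against the template for the SU2 version in use:

```
% --- Illustrative settings for the linear-solver stress test --------------%
% Krylov solver with ILU preconditioning and increased fill-in
LINEAR_SOLVER= FGMRES
LINEAR_SOLVER_PREC= ILU
LINEAR_SOLVER_ILU_FILL_IN= 2
% Converge each linear system tightly so the nonlinear behaviour is not
% polluted by an unconverged linear solve
LINEAR_SOLVER_ERROR= 1E-14
LINEAR_SOLVER_ITER= 200
% Push the CFL as high as the (approximate) Jacobian allows
CFL_NUMBER= 50.0
CFL_ADAPT= YES
% (factor down, factor up, CFL min, CFL max) -- check config_template.cfg
CFL_ADAPT_PARAM= ( 0.5, 1.5, 1.0, 1000.0 )
```

If the linear residuals converge but the CFL still cannot be raised, that points to the Jacobian approximation rather than the linear solver, which is the distinction Tom suggests making before adopting an external library.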

@aeroamit
Contributor

aeroamit commented Feb 6, 2019

Following this discussion, I want to share a few points related to convergence:

1- Some good/fast solution initialisation methods would be helpful to start with (commercial solvers like Fluent use Full Multigrid / FMG initialisation, i.e. Euler initialisation, which provides a fast initial guess to start from)

2- Gradually switching from first-order to second-order discretization would be helpful

3- A smart/tuned CFL ramping strategy (some commercial codes have a tuned way of doing this that covers the CFL range from subsonic to hypersonic Mach numbers)

4- As Dr. Economon mentioned, exact Jacobians play an important role (in the SU2 code, HLLC, JST and Roe have exact/nearly exact Jacobians). I have observed that with an inconsistent discretization (for some problems), the solution cannot be run at higher CFL even in later stages and takes more iterations to converge

5- It would be desirable to arrive at a small set of linear solver + preconditioner combinations that covers a broad range of problems. That would help solution convergence strategies evolve faster

6- Handling poor-quality cells in some way may be important from a practical usage point of view with realistic geometries (I don't know how much literature is available on this)

Regards
Amit

@vdweide
Contributor

vdweide commented Feb 6, 2019

There is one thing that has not been mentioned yet, which is rather important. I assume we are talking about RANS here. In that case it does not make sense to ramp the CFL number up to values higher than 50 or so with the current segregated setup of mean flow, turbulence and possibly transition solvers (not even talking about multidisciplinary problems). The segregated character makes it extremely flexible for adding additional models, but it does not work for a full-blown Newton solver. In that case you have to switch to a strong coupling between the mean flow and the turbulence solver, which will require a significant change in the data structures.
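
In equation form (a sketch, with f denoting the mean-flow and t the turbulence/transition equations; the notation is not SU2's), the strong coupling described above means solving the block system

\begin{bmatrix}
\partial R_f / \partial U_f & \partial R_f / \partial U_t \\
\partial R_t / \partial U_f & \partial R_t / \partial U_t
\end{bmatrix}
\begin{bmatrix} \Delta U_f \\ \Delta U_t \end{bmatrix}
= - \begin{bmatrix} R_f \\ R_t \end{bmatrix}

at each nonlinear iteration, whereas the segregated approach solves only the diagonal blocks in turn and never stores the off-diagonal coupling terms, which is why the data structures would need to change.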

Also, computing the exact Jacobians for a second-order scheme is not trivial and, as Tom mentioned, it may be necessary to use AD tools for that, which makes it quite costly. Furthermore, the memory required to store the exact Jacobians is very large.

An alternative would be to use a matrix-free approach, i.e. use a Frechet derivative for the matrix-vector products in the Krylov solver (although you still need a good preconditioner). When coupled with a turbulence model (especially k-omega type models), this will be extremely sensitive to the epsilon parameter you have to choose. There are ways around this, e.g. using dual numbers, possibly with CoDiPack, but these will increase your computational cost by at least a factor of 4.
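
To make the matrix-free idea concrete, here is a minimal sketch (not SU2 code; the names and the epsilon heuristic are only illustrative assumptions) of a finite-difference Jacobian-vector product. The choice of eps below is exactly the sensitivity issue mentioned above:

```cpp
// Matrix-free Jacobian-vector product via a first-order Frechet derivative:
//   J*v ~ (R(u + eps*v) - R(u)) / eps,
// where R(u) is the nonlinear residual evaluated at the state u.
#include <cmath>
#include <functional>
#include <limits>
#include <vector>

using Vec = std::vector<double>;

Vec FrechetJacobianVectorProduct(const std::function<Vec(const Vec&)>& Residual,
                                 const Vec& u, const Vec& v) {
  const std::size_t n = u.size();

  /*--- Common heuristic: eps = sqrt(machine eps) * (1 + ||u||) / ||v||. ---*/
  double normU = 0.0, normV = 0.0;
  for (std::size_t i = 0; i < n; ++i) {
    normU += u[i] * u[i];
    normV += v[i] * v[i];
  }
  normU = std::sqrt(normU);
  normV = std::sqrt(normV);
  if (normV == 0.0) return Vec(n, 0.0);  // J * 0 = 0
  const double eps =
      std::sqrt(std::numeric_limits<double>::epsilon()) * (1.0 + normU) / normV;

  /*--- Perturb the state along v and difference the residuals. ---*/
  Vec uPert(n);
  for (std::size_t i = 0; i < n; ++i) uPert[i] = u[i] + eps * v[i];
  const Vec R0 = Residual(u);
  const Vec R1 = Residual(uPert);

  Vec Jv(n);
  for (std::size_t i = 0; i < n; ++i) Jv[i] = (R1[i] - R0[i]) / eps;
  return Jv;
}
```

Each Krylov iteration then costs one extra residual evaluation instead of a stored Jacobian, which is where both the memory savings and the sensitivity to eps come from.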

I think it is definitely worth trying, but given all the pitfalls you may run into, you may want to test things out first in a standalone test solver before implementing them in SU2 itself.

My two cents,

Edwin

@jayantmukho
Contributor

@aeroamit
To your first point about solution initialization, @bmunguia and I are working on a solution interpolation scheme that would allow for Euler initialization and interpolation between different grids (assuming they occupy similar domains). We are hoping to have it ready for SU2 7.0.

@pcarruscag
Member Author

This can continue in #711
