How to use ilu preconditioner correctly? #1077
In case you are using Reference, CUDA or HIP, you could try switching from …
My application is a CFD solver for the Navier-Stokes equations. It uses an unstructured grid, so the matrix may be ill-conditioned. Could you please tell me how to use the ILU preconditioner correctly? The relevant documentation and examples are not particularly detailed.
That is true, our documentation needs some expansion; we will focus on that before the next release. It should be sufficient to just replace …
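For context, the swap from the iterative ParILU factorization to the exact ILU factorization can be sketched as follows. This is a minimal sketch modeled on Ginkgo's `ilu-preconditioned-solver` example, not the code from this thread; the restart length, iteration limit, and tolerance are illustrative values.

```cpp
#include <ginkgo/ginkgo.hpp>

int main()
{
    using vtype = double;
    using itype = gko::int32;
    auto exec = gko::ReferenceExecutor::create();

    // ... assemble the system matrix A and the vectors b, x here ...

    // Exact ILU(0) factorization instead of the iterative ParILU:
    auto fact = gko::factorization::Ilu<vtype, itype>::build().on(exec);

    // Wrap the L/U factors in an ILU preconditioner (triangular solves).
    auto pre = gko::preconditioner::Ilu<>::build()
                   .with_factorization_factory(gko::share(fact))
                   .on(exec);

    // Restarted GMRES using that preconditioner.
    auto solver_factory =
        gko::solver::Gmres<vtype>::build()
            .with_krylov_dim(30u)  // restart length, illustrative
            .with_preconditioner(gko::share(pre))
            .with_criteria(
                gko::stop::Iteration::build().with_max_iters(1000u).on(exec),
                gko::stop::ResidualNormReduction<vtype>::build()
                    .with_reduction_factor(1e-10)
                    .on(exec))
            .on(exec);
    // solver_factory->generate(A)->apply(b.get(), x.get());
}
```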
I replaced ParILU with ILU and it still doesn't work. I wonder whether using a COO-type matrix has any effect? I opened "ginkgo/core/factorization/ilu.hpp" and found that the matrix type is fixed to Csr:

```cpp
template <typename ValueType = gko::default_precision,
          typename IndexType = gko::int32>
class Ilu : public Composition<ValueType> {
public:
    using value_type = ValueType;
    using index_type = IndexType;
    using matrix_type = matrix::Csr<ValueType, IndexType>;
    // ...
```
You are using it correctly; the matrix is automatically converted to Csr internally when setting up the preconditioner. Would it be possible to share the matrix? You can run …
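As an aside, the implicit conversion mentioned above can also be done explicitly up front, which additionally allows enforcing sorted column indices. A hedged sketch, assuming a `coo` matrix (a `gko::matrix::Coo`) has already been assembled; the variable names are illustrative:

```cpp
#include <ginkgo/ginkgo.hpp>

int main()
{
    auto exec = gko::ReferenceExecutor::create();
    // Assumes 'coo' was assembled elsewhere as
    // std::shared_ptr<gko::matrix::Coo<double, gko::int32>>.
    auto coo = gko::matrix::Coo<double, gko::int32>::create(exec);

    // Convert to Csr explicitly instead of relying on the implicit
    // conversion inside the preconditioner setup.
    auto csr = gko::matrix::Csr<double, gko::int32>::create(exec);
    coo->convert_to(csr.get());

    // Csr lets you enforce sorted column indices within each row.
    csr->sort_by_column_index();
}
```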
Matrices A and b (suffixed with .dat) are provided in the attachment. As you said, my Coo matrix is most likely not sorted/grouped by row index, since it is filled with 5x5 block submatrices. I therefore also provide the row, column, and value information of the Coo matrix (A.txt) in the attachment for your reference.
I guess we need to push #770 forward a bit to be able to figure out such issues quickly: passing unsorted data to Coo may cause all kinds of weird things to happen, depending on the executor. Did you ever try running your code with the Reference executor? Does it fail there as well? Reference doesn't have any issues with unordered Coo data, so we can use it for comparison. Also, I can't seem to access the uploaded file (…
The executor is created by gko::ReferenceExecutor::create(), so it is exactly the Reference executor. Now let me try to upload the file again.
Could you elaborate on how the matrix is generated, i.e. which CFD method is behind it? Could you try a different solver/preconditioner combination, such as BiCGStab + Jacobi, just to see if it also produces NaNs? What I find interesting is that there are off-diagonal elements within the blocks that are much larger than the main diagonal entries; whether that is the cause of the convergence issues is unclear, however. Additionally, A.dat and A.txt seem to contain different values, or at least a different ordering: A[0,1] = -0.00314816 in A.txt vs. -0.23123 in A.dat. The value -0.00314816 appears much later in A.dat, so could it be that the ordering is broken? Furthermore, while the sparsity pattern of A.txt looks OK, the one in A.dat looks broken. Can you double-check that the conversion is correct?
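The suggested BiCGStab + Jacobi comparison run might look roughly like the following. This is a hedged sketch following Ginkgo's solver examples, not code from this thread; the block size, iteration limit, and tolerance are illustrative assumptions (a block size of 5 would match the 5x5 CFD blocks mentioned above).

```cpp
#include <ginkgo/ginkgo.hpp>

int main()
{
    auto exec = gko::ReferenceExecutor::create();
    // ... assemble A, b, x as in the original code ...

    auto solver_factory =
        gko::solver::Bicgstab<double>::build()
            .with_preconditioner(
                gko::preconditioner::Jacobi<double, gko::int32>::build()
                    .with_max_block_size(5u)  // assumption: 5x5 CFD blocks
                    .on(exec))
            .with_criteria(
                gko::stop::Iteration::build().with_max_iters(1000u).on(exec),
                gko::stop::ResidualNormReduction<double>::build()
                    .with_reduction_factor(1e-10)
                    .on(exec))
            .on(exec);
    // solver_factory->generate(A)->apply(b.get(), x.get());
}
```

If this combination converges cleanly while GMRES + ILU produces NaNs, that points toward the factorization (or the input ordering it sees) rather than the assembly.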
I'll close this issue now, as it seems stale. Feel free to reopen if needed. |
I want to use Ginkgo's restarted GMRES with an ILU preconditioner as a solver for sparse linear systems, but the tutorial provides no detailed instructions. I implemented GMRES with the ILU preconditioner with reference to the example, but the result of the solve is NaN. Here is my code snippet:
The assembly of the matrix and the right-hand-side vector should not be the problem, because when I switch to the Jacobi preconditioner I get the correct result:
What is wrong with my way of using GMRES with the ILU preconditioner?