Error when Jacobian is Tridiagonal #67
Interesting. I wonder if there's a generic way to do the decompression. @pkj-m let's discuss this.
Tridiagonal matrices are failing this condition (https://github.com/JuliaDiffEq/DiffEqDiffTools.jl/blob/master/src/jacobians.jl#L230), so they get treated as a SparseMatrix, which leads to the error. I think it'll be easier to just convert all Tridiagonal (and BandedMatrices) inputs to a regular matrix when decompressing.
I hope that the output Jacobian keeps its structure.
Yes, we just need to put an interface on this.
Yeah, that sounds like a much better idea. It'll completely eliminate any extra conditions we might have had to put on the input matrix while decompressing, and it will probably keep the code a lot neater as well.
Think about what those interface functions need to be. We can add them to ArrayInterface.jl, which is just a really lightweight "things we are looking to upstream to Base" package, and add the overloads for Base arrays there as well, with non-Base arrays using Requires.jl. Then we can use that package everywhere to introduce the right interface functions while we try to get that particular feature upstreamed to Base.
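As a rough sketch of what such an interface function could look like (the name `structural_indices` here is hypothetical, not the actual ArrayInterface.jl API), a structured matrix would expose which entries it actually stores, so a decompression loop never touches off-band positions:

```julia
using LinearAlgebra

# Hypothetical interface function: return the (row, col) indices of the
# structurally stored entries of a matrix. Generic fallback visits everything;
# a specialized method for Tridiagonal visits only the three bands.
structural_indices(A::AbstractMatrix) = Tuple.(CartesianIndices(A))

function structural_indices(A::Tridiagonal)
    n = size(A, 1)
    inds = Tuple{Int,Int}[]
    for i in 1:n, j in max(1, i - 1):min(n, i + 1)
        push!(inds, (i, j))
    end
    return inds
end

# Decompression can then iterate over stored entries only:
A = Tridiagonal(ones(3), fill(2.0, 4), ones(3))
for (i, j) in structural_indices(A)
    A[i, j] = i + j   # never writes off the tridiagonal band
end
```

The point of routing this through one interface package is that every downstream consumer (finite differencing, autodiff, solvers) dispatches on the same functions instead of each special-casing Tridiagonal, BandedMatrix, and so on.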
Tridiagonal works now, and other structured matrices will be added by JuliaArrays/ArrayInterface.jl#14.

@JuliaRegistrator register()
Registration pull request created: JuliaRegistries/General/2390

After the above pull request is merged, it is recommended that a tag be created on this repository for the registered package version. This will be done automatically if Julia TagBot is installed, or it can be done manually through the GitHub interface.
That's great news! However, I don't think this is working for now:

```julia
using LinearAlgebra, SparseDiffTools, DiffEqDiffTools

function f(out, x)
    for i in eachindex(x)
        out[i] = x[i] + x[max(i - 1, 1)] + x[min(i + 1, length(x))]
    end
    return out
end

x = rand(10)
J = Tridiagonal(ones(length(x) - 1), ones(length(x)), ones(length(x) - 1))
colors = matrix_colors(J)
DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)
# ERROR: ArgumentError: cannot set entry (3, 1) off the tridiagonal band to a nonzero value
```
The iterator for Tridiagonal might be off.
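For context, a minimal illustration (independent of the package code) of why a too-broad iterator trips exactly this error: `Tridiagonal` only stores its three bands, and `setindex!` throws for any nonzero write outside them, so an iterator visiting the full n×n index set will fail on the first off-band column:

```julia
using LinearAlgebra

T = Tridiagonal(ones(2), ones(3), ones(2))  # 3×3 tridiagonal
T[1, 2] = 5.0        # on the band: fine
T[3, 1] = 0.0        # off the band: allowed only because the value is zero
try
    T[3, 1] = 1.0    # off the band with a nonzero value
catch err
    println(err)     # ArgumentError, matching the error in the MWE above
end
```

So the decompression loop has to restrict itself to the stored band entries rather than iterating over every `(i, j)`.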
I think there is also an issue with other types. For instance, with the master version of ArrayInterface:

```julia
using LinearAlgebra, SparseDiffTools, DiffEqDiffTools, BandedMatrices

function f(out, x)
    for i in eachindex(x)
        out[i] = x[i] + x[max(i - 1, 1)] + x[min(i + 1, length(x))]
    end
    return out
end

x = rand(10)
J = BandedMatrix(Zeros(length(x), length(x)), (1, 1))
colors = matrix_colors(J)
DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)
```
Actually, you probably just have old versions. We just need to get all of this stuff propagating through the system. You need ArrayInterface 1.1 and DiffEqDiffTools 1.0. Your Tridiagonal example is working for me.
Okay yeah, I can confirm the MWE:

```julia
using LinearAlgebra, SparseDiffTools, DiffEqDiffTools, BandedMatrices

function f(out, x)
    for i in eachindex(x)
        out[i] = x[i] + x[max(i - 1, 1)] + x[min(i + 1, length(x))]
    end
    return out
end

x = rand(10)
J = BandedMatrix(Zeros(length(x), length(x)), (1, 1))
colors = matrix_colors(J)
DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)
```
I am trying to update DiffEqDiffTools to master, but I have a compatibility requirement issue. Posting here in case there is a simple solution. On a version of Julia without any Manifest or Project:

```
pkg> add https://github.com/JuliaDiffEq/DiffEqDiffTools.jl#master
pkg> add NLsolve
ERROR: Unsatisfiable requirements detected for package NLsolve [2774e3e8]:
NLsolve [2774e3e8] log:
├─possible versions are: [0.1.0-0.1.4, 0.2.0, 0.3.0-0.3.3, 0.4.0, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.3, 0.8.0, 0.9.0-0.9.1, 0.10.0-0.10.1, 0.11.0, 0.12.0-0.12.1, 0.13.0, 0.14.0-0.14.1, 1.0.0-1.0.1, 1.1.0-1.1.1, 2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0] or uninstalled
├─restricted to versions * by an explicit requirement, leaving only versions [0.1.0-0.1.4, 0.2.0, 0.3.0-0.3.3, 0.4.0, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.3, 0.8.0, 0.9.0-0.9.1, 0.10.0-0.10.1, 0.11.0, 0.12.0-0.12.1, 0.13.0, 0.14.0-0.14.1, 1.0.0-1.0.1, 1.1.0-1.1.1, 2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0]
├─restricted by julia compatibility requirements to versions: [2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0] or uninstalled, leaving only versions: [2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0]
└─restricted by compatibility requirements with DiffEqDiffTools [01453d9d] to versions: [0.1.0-0.1.4, 0.2.0, 0.3.0-0.3.3, 0.4.0, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.3, 0.8.0, 0.9.0-0.9.1, 0.10.0-0.10.1, 0.11.0, 0.12.0-0.12.1, 0.13.0] or uninstalled — no versions left
  └─DiffEqDiffTools [01453d9d] log:
    ├─possible versions are: 1.0.0 or uninstalled
    └─DiffEqDiffTools [01453d9d] is fixed to version 1.0.0
```
Ok, with the latest version of everything, I cannot reproduce any of the MWEs. Everything seems to work! Thanks! And sorry for the unnecessary noise.
I've tried it on my package, and unfortunately, keeping the BandedBlockBandedMatrix structure tends to slow things down. (The positive side of this is that just using sparse matrices was already super fast!) In this example, filling a BandedBlockBandedMatrix is 10 times slower than filling a sparse matrix. Is this to be expected?

```julia
using LinearAlgebra, SparseArrays, SparseDiffTools, DiffEqDiffTools, BlockBandedMatrices, BenchmarkTools

function f(out, x)
    x = reshape(x, 100, 100)
    out = reshape(out, 100, 100)
    for i in 1:100, j in 1:100
        out[i, j] = x[i, j] + x[max(i - 1, 1), j] + x[min(i + 1, size(x, 1)), j] +
                    x[i, max(j - 1, 1)] + x[i, min(j + 1, size(x, 2))]
    end
    return vec(out)
end

x = rand(10000)
J = BandedBlockBandedMatrix(Ones(10000, 10000), (fill(100, 100), fill(100, 100)), (1, 1), (1, 1))
Jsparse = sparse(J)
colors = matrix_colors(J)
@btime DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)
@btime DiffEqDiffTools.finite_difference_jacobian!(Jsparse, f, x, color=colors)
```
Not expected. Maybe the iterator on the BandedBlockBandedMatrix is slow? This will need some profiling. Open a new issue on this.
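One quick way to check whether scalar `setindex!` on the BandedBlockBandedMatrix is the bottleneck, before profiling the whole pipeline (a sketch only: `fill_diag!` is a hypothetical helper that touches just the always-stored diagonal entries, and the constructor mirrors the MWE above at a smaller size):

```julia
using SparseArrays, BenchmarkTools, BlockBandedMatrices, FillArrays

n = 1000
J = BandedBlockBandedMatrix(Ones(n, n), (fill(100, 10), fill(100, 10)), (1, 1), (1, 1))
Jsparse = sparse(J)

# Write the same stored entries into each storage type and compare.
function fill_diag!(A)
    for i in 1:size(A, 1)
        A[i, i] = 1.0
    end
    return A
end

@btime fill_diag!($J)        # if this is much slower than the sparse case,
@btime fill_diag!($Jsparse)  # scalar writes into the banded structure are the culprit
```

If the gap shows up here, the fix belongs in the structured-matrix write path rather than in the differencing code itself.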
Where do you want me to open it? ArrayInterface or here?
Let's say DifferentialEquations.jl. Ping me and @huanglangwen, and I'll bring Sheehan into this. Make it an issue about specializing BlockBandedMatrices throughout the entire system. We are almost there: just a tag is needed on OrdinaryDiffEq and such, and it'll be using the techniques everywhere. Once that's done, we need to profile each of the steps and make sure all of the "fast paths for BlockBandedMatrices" are actually fast (Sheehan can help us here).
`finite_difference_jacobian!` returns an error when the Jacobian is a Tridiagonal matrix (or a BandedMatrix/BlockBandedMatrix, etc.). (Found this when trying to use SparseDiffTools.jl for a Tridiagonal Jacobian.)