
Error when Jacobian is Tridiagonal #67

Closed · matthieugomez opened this issue Jul 3, 2019 · 21 comments

@matthieugomez (Contributor) commented Jul 3, 2019:

finite_difference_jacobian! returns an error when the Jacobian is a Tridiagonal matrix (or a BandedMatrix, BlockBandedMatrix, etc.):

using LinearAlgebra, DiffEqDiffTools
y0 = ones(10)
J = Tridiagonal(ones(9), ones(10), ones(9))
DiffEqDiffTools.finite_difference_jacobian!(J,  (dy, y) -> dy, y0)
#> ERROR: UndefVarError: cols_index not defined

(I found this when trying to use SparseDiffTools.jl for a Tridiagonal Jacobian.)

@ChrisRackauckas (Member) commented:

Interesting. I wonder if there's a generic way to do the decompression. @pkj-m, let's discuss this.

@pkj-m (Contributor) commented Jul 3, 2019:

I think it'll be easier to just convert all Tridiagonal (and BandedMatrix) Jacobians to either a regular matrix with convert(Array, J) or to a sparse matrix with sparse(J) before processing them further.

Tridiagonal matrices fail this condition https://github.com/JuliaDiffEq/DiffEqDiffTools.jl/blob/master/src/jacobians.jl#L230, so they get treated as a sparse matrix, which leads to the error.
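
For concreteness, the two conversions mentioned above are just the following (a minimal sketch using stock LinearAlgebra and SparseArrays; this is the workaround, not the eventual fix):

using LinearAlgebra, SparseArrays

J = Tridiagonal(ones(9), ones(10), ones(9))
Jdense  = convert(Array, J)   # dense fallback: drops the banded structure entirely
Jsparse = sparse(J)           # keeps the sparsity pattern and takes the sparse code path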

@matthieugomez (Contributor, Author) commented:

I hope that the Jacobian output by DiffEqDiffTools can remain Tridiagonal, though (even if this particular structure is not used inside finite_difference_jacobian!).

@ChrisRackauckas (Member) commented:

Yes, we just need to put an interface on this. rows_idxs and cols_idxs can be generic functions which we then overload for a bunch of different matrix types so they all act the same.
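
As a rough illustration of what such an interface could look like (the function name below is made up for the sketch, not what was ultimately added): a generic dense fallback plus an overload that reports only the structurally nonzero positions of a Tridiagonal.

using LinearAlgebra

# Dense fallback: every entry is treated as structurally nonzero.
structural_rowcol_idxs(A::AbstractMatrix) =
    ([i for j in 1:size(A, 2) for i in 1:size(A, 1)],
     [j for j in 1:size(A, 2) for i in 1:size(A, 1)])

# Tridiagonal overload: only the three bands are structurally nonzero.
function structural_rowcol_idxs(A::Tridiagonal)
    n = size(A, 1)
    rows, cols = Int[], Int[]
    for j in 1:n, i in max(1, j - 1):min(n, j + 1)
        push!(rows, i)
        push!(cols, j)
    end
    return rows, cols
end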

@pkj-m (Contributor) commented Jul 3, 2019:

Yeah, that sounds like a much better idea. It'll completely eliminate any extra conditions we might have had to put on the input matrix while decompressing, and probably keep the code a lot neater as well.

@ChrisRackauckas (Member) commented:

Think of what those interface functions need to be. We can add them to ArrayInterface.jl, which is just a really lightweight "things we are looking to upstream to Base" package, and add the overloads for Base arrays there as well, with non-Base arrays using Requires.jl. Then we can use that package everywhere to introduce the right interface functions while we try to get that particular feature upstreamed to Base.
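
A rough sketch of that layering, with a hypothetical interface function; the UUID below is a placeholder rather than BandedMatrices' real one (the actual value would come from its Project.toml):

module ArrayInterfaceSketch

using LinearAlgebra
using Requires

# Generic fallback plus an overload for a Base structured type.
fast_matrix_colors(A::AbstractMatrix) = 1:size(A, 2)                        # dense: one color per column
fast_matrix_colors(A::Tridiagonal)    = [mod1(j, 3) for j in 1:size(A, 2)]  # three colors suffice

function __init__()
    # Overloads for non-Base array types only load if the user loads those packages.
    @require BandedMatrices="00000000-0000-0000-0000-000000000000" begin
        fast_matrix_colors(A::BandedMatrices.BandedMatrix) = 1:size(A, 2)   # placeholder overload
    end
end

end # module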

@huanglangwen (Contributor) commented Jul 26, 2019:

#71
JuliaArrays/ArrayInterface.jl#9
JuliaDiff/SparseDiffTools.jl#57

@ChrisRackauckas (Member) commented:

Tridiagonal works now, and other structured matrices will be added by JuliaArrays/ArrayInterface.jl#14.

@JuliaRegistrator register()

@JuliaRegistrator commented:

Registration pull request created: JuliaRegistries/General/2390

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if Julia TagBot is installed, or can be done manually through the GitHub interface, or via:

git tag -a v1.0.0 -m "<description of version>" e06f5bfbd99f196af1798b0fbd29181923f1eb1e
git push origin v1.0.0

@matthieugomez (Contributor, Author) commented Jul 31, 2019:

That's great news! However, I don't think this is working yet:

using LinearAlgebra, SparseDiffTools, DiffEqDiffTools
function f(out, x)
	for i in eachindex(x)
		out[i] = x[i] + x[max(i -1, 1)] + x[min(i+1, length(x))]
	end
	return out
end
x = rand(10)
J = Tridiagonal(ones(length(x)-1), ones(length(x)), ones(length(x)-1))
colors = matrix_colors(J)
DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)
# ERROR: ArgumentError: cannot set entry (3, 1) off the tridiagonal band to a nonzero value 
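
For reference, the error itself comes from Tridiagonal's setindex!, which only allows nonzero writes on the three bands, so the decompression step must be writing outside the band. A stand-alone illustration (stock LinearAlgebra, no DiffEqDiffTools involved):

using LinearAlgebra
J = Tridiagonal(ones(9), ones(10), ones(9))
J[3, 3] = 2.0   # fine: on the band
J[3, 1] = 0.0   # also fine: off the band, but writing zero is a no-op
J[3, 1] = 0.5   # ArgumentError: cannot set entry (3, 1) off the tridiagonal band to a nonzero value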

@ChrisRackauckas (Member) commented:

The iterator for Tridiagonal might be off.

@matthieugomez (Contributor, Author) commented:

I think there is also an issue with other types. For instance, with the master version of ArrayInterface:

using LinearAlgebra, SparseDiffTools, DiffEqDiffTools, BandedMatrices
function f(out, x)
    for i in eachindex(x)
        out[i] = x[i] + x[max(i -1, 1)] + x[min(i+1, length(x))]
    end
    return out
end
x = rand(10)
J = BandedMatrix(Zeros(length(x),length(x)), (1,1))
colors = matrix_colors(J)
DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)

@ChrisRackauckas (Member) commented Jul 31, 2019:

Actually, you probably just have old versions. We just need to get all of this stuff propagating through the system. You need ArrayInterface 1.1 and DiffEqDiffTools 1.0. Your Tridiagonal example is working for me.
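
For anyone following along, pinning exactly those versions with the stock Pkg API would look roughly like this:

using Pkg
Pkg.add(PackageSpec(name="ArrayInterface", version="1.1"))
Pkg.add(PackageSpec(name="DiffEqDiffTools", version="1.0"))
Pkg.status()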

@ChrisRackauckas (Member) commented:

Okay, yeah, I can confirm the MWE:

using LinearAlgebra, SparseDiffTools, DiffEqDiffTools, BandedMatrices
function f(out, x)
    for i in eachindex(x)
        out[i] = x[i] + x[max(i -1, 1)] + x[min(i+1, length(x))]
    end
    return out
end
x = rand(10)
J = BandedMatrix(Zeros(length(x),length(x)), (1,1))
colors = matrix_colors(J)
DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)

@matthieugomez (Contributor, Author) commented:

I am trying to update DiffEqDiffTools to master, but I have a compatibility requirement issue.

Posting here in case there is a simple solution. In a fresh Julia environment without any Manifest or Project:

add https://github.com/JuliaDiffEq/DiffEqDiffTools.jl#master
add NLsolve
ERROR: Unsatisfiable requirements detected for package NLsolve [2774e3e8]:
 NLsolve [2774e3e8] log:
 ├─possible versions are: [0.1.0-0.1.4, 0.2.0, 0.3.0-0.3.3, 0.4.0, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.3, 0.8.0, 0.9.0-0.9.1, 0.10.0-0.10.1, 0.11.0, 0.12.0-0.12.1, 0.13.0, 0.14.0-0.14.1, 1.0.0-1.0.1, 1.1.0-1.1.1, 2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0] or uninstalled
 ├─restricted to versions * by an explicit requirement, leaving only versions [0.1.0-0.1.4, 0.2.0, 0.3.0-0.3.3, 0.4.0, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.3, 0.8.0, 0.9.0-0.9.1, 0.10.0-0.10.1, 0.11.0, 0.12.0-0.12.1, 0.13.0, 0.14.0-0.14.1, 1.0.0-1.0.1, 1.1.0-1.1.1, 2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0]
 ├─restricted by julia compatibility requirements to versions: [2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0] or uninstalled, leaving only versions: [2.0.0-2.0.1, 2.1.0, 3.0.0-3.0.1, 4.0.0]
 └─restricted by compatibility requirements with DiffEqDiffTools [01453d9d] to versions: [0.1.0-0.1.4, 0.2.0, 0.3.0-0.3.3, 0.4.0, 0.5.0, 0.6.0-0.6.1, 0.7.0-0.7.3, 0.8.0, 0.9.0-0.9.1, 0.10.0-0.10.1, 0.11.0, 0.12.0-0.12.1, 0.13.0] or uninstalled — no versions left
   └─DiffEqDiffTools [01453d9d] log:
     ├─possible versions are: 1.0.0 or uninstalled
     └─DiffEqDiffTools [01453d9d] is fixed to version 1.0.0

@ChrisRackauckas (Member) commented:

@matthieugomez (Contributor, Author) commented:

OK, with the latest version of everything, I cannot reproduce any of the MWEs. Everything seems to work! Thanks! And sorry for the unnecessary noise.

@matthieugomez (Contributor, Author) commented Jul 31, 2019:

I've tried it in my package, and unfortunately, keeping the BandedBlockBandedMatrix structure tends to slow things down. (The positive side of this is that just using sparse matrices was already super fast!)

In this example, filling a BandedBlockBandedMatrix is 10 times slower than filling a sparse matrix. Is this to be expected?

using LinearAlgebra, SparseArrays, SparseDiffTools, DiffEqDiffTools, BlockBandedMatrices, BenchmarkTools
function f(out, x)
	x = reshape(x, 100, 100)
	out = reshape(out, 100, 100)
	for i in 1:100
		for j in 1:100
			out[i, j] = x[i, j] + x[max(i -1, 1), j] + x[min(i+1, size(x, 1)), j] +  x[i, max(j-1, 1)]  + x[i, min(j+1, size(x, 2))]
		end
	end
	return vec(out)
end
x = rand(10000)
J = BandedBlockBandedMatrix(Ones(10000, 10000), (fill(100, 100), fill(100, 100)), (1, 1), (1, 1))
Jsparse = sparse(J)
colors = matrix_colors(J)
@btime DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)
@btime DiffEqDiffTools.finite_difference_jacobian!(Jsparse, f, x, color=colors)

@ChrisRackauckas (Member) commented:

Not expected. Maybe the iterator on the BandedBlockBandedMatrix is slow? This will need some profiling. Open a new issue on this.
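
A minimal way to start that profiling, assuming the benchmark MWE above is still in scope (Profile ships with Julia's standard library):

using Profile
Profile.clear()
@profile for _ in 1:10
    DiffEqDiffTools.finite_difference_jacobian!(J, f, x, color=colors)
end
Profile.print(mincount=50)   # trim noise; look for time spent in getindex/setindex! on J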

@matthieugomez (Contributor, Author) commented:

Where do you want me to open it? ArrayInterface or here?

@ChrisRackauckas (Member) commented:

Let's say DifferentialEquations.jl. Ping me and @huanglangwen, and I'll bring Sheehan into this. Make it an issue about specializing BlockBandedMatrices throughout the entire system. We are almost there; just a tag is needed on OrdinaryDiffEq and such, and then it'll be using these techniques everywhere. Once that's done, we need to profile each of the steps and make sure all of the "fast paths for BlockBandedMatrices" are actually fast (and Sheehan can help us here).
