Revert "Better explanation of preparation (#536)"
This reverts commit 4d39e77.
gdalle authored Oct 3, 2024
1 parent 4d39e77 commit bfbc719
Showing 4 changed files with 14 additions and 24 deletions.
11 changes: 3 additions & 8 deletions DifferentiationInterface/docs/src/explanation/backends.md
@@ -137,12 +137,9 @@ For every operator, preparation generates an [executable function](https://brian
### FiniteDiff

Whenever possible, preparation creates a cache object.
The pushforward is implemented with a closure, which makes it comparatively slow.

### FiniteDifferences

Nothing specific to mention.

### ForwardDiff

We implement [`pushforward`](@ref) directly using [`Dual` numbers](https://juliadiff.org/ForwardDiff.jl/stable/dev/how_it_works/), and preparation allocates the necessary space.
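As a rough illustration of the mechanism, a dual number carries a primal value and a tangent together through the computation. The sketch below is a hand-rolled stand-in, not ForwardDiff's actual `Dual` type:

```julia
# Minimal dual-number sketch of a forward-mode pushforward.
# This is NOT ForwardDiff's Dual type, just the underlying idea.
struct MiniDual
    val::Float64  # primal value
    tan::Float64  # tangent (directional derivative)
end

Base.:+(a::MiniDual, b::MiniDual) = MiniDual(a.val + b.val, a.tan + b.tan)
Base.:*(a::MiniDual, b::MiniDual) = MiniDual(a.val * b.val, a.tan * b.val + a.val * b.tan)

# Pushforward of f(x) = x^2 at x = 3 along the tangent dx = 1:
f(x) = x * x
y = f(MiniDual(3.0, 1.0))
y.val  # 9.0, the primal output f(3)
y.tan  # 6.0, the derivative f'(3) = 2 * 3
```

One function evaluation on dual inputs yields both the primal output and the Jacobian-vector product, which is why preparation only needs to allocate the dual workspace.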
@@ -155,12 +152,9 @@ Most operators fall back on `AutoForwardDiff`.
### ReverseDiff

Wherever possible, preparation records a [tape](https://juliadiff.org/ReverseDiff.jl/dev/api/#The-AbstractTape-API) of the function's execution.
This tape is computed from the arguments `x` and `contexts...` provided at preparation time.
It is control-flow dependent, so only one branch is recorded at each `if` statement.

!!! danger
If your function has value-specific control flow (like `if x[1] > 0` or `if c == 1`), you may get silently wrong results whenever it takes new branches that were not taken during preparation.
You must make sure to run preparation with an input and contexts whose values trigger the correct control flow for future executions.
!!! warning
This tape is specific to the control flow inside the function, and cannot be reused if the control flow is value-dependent (like `if x[1] > 0`).
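To see the failure mode, consider a hypothetical function with value-dependent control flow. The sketch below only illustrates the frozen-branch problem; it is not ReverseDiff's actual tape machinery:

```julia
# Sketch of how a recorded tape freezes control flow.
# (Illustrative only; not ReverseDiff's actual tape machinery.)
f(x) = x[1] > 0 ? 2 * x[1] : -2 * x[1]

# "Preparation" at x = [1.0] takes the positive branch, so the
# replayed computation behaves like x -> 2 * x[1] for every input:
replayed(x) = 2 * x[1]  # branch frozen at preparation time

x_new = [-3.0]
f(x_new)         # 6.0: a fresh evaluation takes the negative branch
replayed(x_new)  # -6.0: the frozen tape silently returns the wrong value
```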

### Symbolics

@@ -182,3 +176,4 @@ Same-point preparation runs the forward sweep and returns the pullback closure a

We implement `pullback` based on `Zygote.pullback`.
Same-point preparation runs the forward sweep and returns the pullback closure at `x`.

11 changes: 5 additions & 6 deletions DifferentiationInterface/docs/src/explanation/operators.md
@@ -125,14 +125,13 @@ Here are the general rules that we strive to implement:

For different-point preparation, the output `prep` of `prepare_op(f, b, x, [t])` can be reused in `op(f, prep, b, other_x, [other_t])`, provided that:

- the inputs `x` and `other_x` have the same types and sizes
- the tangents in `t` and `other_t` have the same types and sizes
- the inputs `x` and `other_x` have similar types and equal shapes
- the tangents in `t` and `other_t` have similar types and equal shapes

For same-point preparation, the output `prep` of `prepare_op_same_point(f, b, x, [t])` can be reused in `op(f, prep, b, x, other_t)`, provided that:

- the input `x` remains exactly the same (as well as any [`Constant`](@ref) context)
- the tangents in `t` and `other_t` have the same types and sizes
- the input `x` remains the same (as well as the [`Context`](@ref) constants)
- the tangents in `t` and `other_t` have similar types and equal shapes

!!! warning
These rules hold for the majority of backends, but there are some exceptions.
The most important exception is [ReverseDiff](@ref) and its taping mechanism, which is sensitive to control flow inside the function.
These rules hold for the majority of backends, but there are some exceptions.
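For instance, with the gradient operator and the argument order `op(f, prep, b, other_x)` described above, different-point reuse looks like this (a usage sketch assuming ForwardDiff as the backend):

```julia
using DifferentiationInterface
import ForwardDiff

f(x) = sum(abs2, x)
backend = AutoForwardDiff()

x = rand(10)
prep = prepare_gradient(f, backend, x)

# Valid reuse: other_x has the same type and size as x.
other_x = rand(10)
grad = gradient(f, prep, backend, other_x)  # equals 2 .* other_x
```

Reusing `prep` with an input of a different size or element type is not covered by these rules and may error or silently misbehave.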
@@ -117,8 +117,8 @@ end
function DI.prepare_gradient(
f::F, backend::AutoEnzyme{<:ForwardMode,<:Union{Nothing,Const}}, x
) where {F}
valB = pick_batchsize(backend, length(x))
shadows = create_shadows(valB, x)
B = pick_batchsize(backend, length(x))
shadows = create_shadows(Val(B), x)
return EnzymeForwardGradientPrep{B,typeof(shadows)}(shadows)
end

@@ -180,8 +180,8 @@ function DI.prepare_jacobian(
f::F, backend::AutoEnzyme{<:Union{ForwardMode,Nothing},<:Union{Nothing,Const}}, x
) where {F}
y = f(x)
valB = pick_batchsize(backend, length(x))
shadows = create_shadows(valB, x)
B = pick_batchsize(backend, length(x))
shadows = create_shadows(Val(B), x)
return EnzymeForwardOneArgJacobianPrep{B,typeof(shadows)}(shadows, length(y))
end

@@ -349,15 +349,11 @@ end

struct EnzymeReverseOneArgJacobianPrep{Sy,B} <: JacobianPrep end

function EnzymeReverseOneArgJacobianPrep(::Val{Sy}, ::Val{B}) where {Sy,B}
return EnzymeReverseOneArgJacobianPrep{Sy,B}()
end

function DI.prepare_jacobian(f::F, backend::AutoEnzyme{<:ReverseMode,Nothing}, x) where {F}
y = f(x)
Sy = size(y)
valB = pick_batchsize(backend, prod(Sy))
return EnzymeReverseOneArgJacobianPrep(Val(Sy), valB)
B = pick_batchsize(backend, prod(Sy))
return EnzymeReverseOneArgJacobianPrep{Sy,B}()
end

function DI.jacobian(
