
Commit

Update normalise.jl
mcognetta authored Nov 29, 2021
1 parent 185ab40 commit 1242c20
Showing 1 changed file with 2 additions and 2 deletions.
src/layers/normalise.jl
@@ -31,7 +31,7 @@ The [`Dropout`](@ref) layer is what you should use in most scenarios.
 function dropout(x, p; dims=:, active::Bool=true)
   active || return x
   y = rand!(similar(x, _dropout_shape(x, dims)))
-  @inbounds @. y = x * _dropout_kernel(y, p, 1-p)
+  @. y = x * _dropout_kernel(y, p, 1-p)
 end

 @adjoint function dropout(x, p; dims=:, active::Bool=true)
@@ -56,7 +56,7 @@
 e.g. `Dropout(p; dims = 3)` will randomly zero out entire channels on WHCN input
 (also called 2D dropout).
 Does nothing to the input once [`Flux.testmode!`](@ref) is `true`.
-"""`
+"""
 mutable struct Dropout{F,D}
   p::F
   dims::D
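The first hunk touches the element-wise dropout computation. As a minimal, self-contained sketch of what that line does (the real `_dropout_kernel` is defined elsewhere in Flux's source and is not shown in this diff; the standard inverted-dropout form is assumed here, and `dropout_sketch` is a hypothetical stand-in name):

```julia
# Hedged sketch of the broadcast changed in this diff. Assumption: the usual
# inverted-dropout kernel — keep an element when its uniform draw exceeds p,
# rescaling survivors by 1/(1-p) so the expected value of the output matches
# the input.
_dropout_kernel(y, p, q) = y > p ? inv(q) : zero(y)

function dropout_sketch(x, p; active::Bool=true)
    active || return x
    y = rand(eltype(x), size(x))         # one uniform draw per element
    @. x * _dropout_kernel(y, p, 1 - p)  # the broadcast from the diff, sans `@inbounds`
end
```

The commit itself changes no behavior of this computation: it only drops the `@inbounds` annotation from the broadcast (the `@.` broadcast indexes in-bounds by construction) and, in the second hunk, removes a stray backtick that had been left after the docstring's closing `"""`.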
