Update documentation build to Julia 1.11.
fredrikekre committed Nov 26, 2024
1 parent 258540f commit b6ae45c
Showing 14 changed files with 345 additions and 233 deletions.
228 changes: 165 additions & 63 deletions docs/Manifest.toml

Large diffs are not rendered by default.

2 changes: 2 additions & 0 deletions docs/Project.toml
@@ -1,3 +1,5 @@
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
+ ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
+ StaticArrays = "90137ffa-7385-5640-81b9-e52037218182"
Tensors = "48a634ad-e948-5137-8d70-aa71f2a747f4"
12 changes: 6 additions & 6 deletions docs/src/demos.md
@@ -129,13 +129,13 @@ julia> C = tdot(F);
julia> S_AD = 2 * gradient(C -> Ψ(C, μ, Kb), C)
3×3 SymmetricTensor{2, 3, Float64, 6}:
- 4.30534e11 -2.30282e11 -8.52861e10
- -2.30282e11 4.38793e11 -2.64481e11
- -8.52861e10 -2.64481e11 7.85515e11
+ 2.36415e11 -1.37206e11 -3.15432e10
+ -1.37206e11 2.18256e11 -6.70562e10
+ -3.15432e10 -6.70562e10 1.06713e11
julia> S(C, μ, Kb)
3×3 SymmetricTensor{2, 3, Float64, 6}:
- 4.30534e11 -2.30282e11 -8.52861e10
- -2.30282e11 4.38793e11 -2.64481e11
- -8.52861e10 -2.64481e11 7.85515e11
+ 2.36415e11 -1.37206e11 -3.15432e10
+ -1.37206e11 2.18256e11 -6.70562e10
+ -3.15432e10 -6.70562e10 1.06713e11
```
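The updated doctest values above depend on the Julia 1.11 RNG stream, but the identity they demonstrate, `S = 2 ∂Ψ/∂C`, is seed-independent. A minimal local sketch of the same computation (the energy `Ψ` and parameters `μ`, `Kb` below are illustrative assumptions, not necessarily the ones defined earlier in the demo):

```julia
using Tensors

# Illustrative neo-Hookean-style strain energy (an assumption for this sketch;
# the demo defines its own Ψ)
Ψ(C, μ, Kb) = μ / 2 * (tr(C) - 3) - μ * log(sqrt(det(C))) + Kb / 2 * log(sqrt(det(C)))^2

μ, Kb = 1.0e6, 1.66e6
F = one(Tensor{2, 3}) + 0.1 * rand(Tensor{2, 3})  # deformation gradient
C = tdot(F)                                       # right Cauchy-Green tensor Fᵀ ⋅ F

# Second Piola-Kirchhoff stress via automatic differentiation
S_AD = 2 * gradient(C -> Ψ(C, μ, Kb), C)
```

The result is a `SymmetricTensor{2, 3}`, since `gradient` of a scalar function of a symmetric tensor preserves symmetry.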
16 changes: 8 additions & 8 deletions docs/src/man/automatic_differentiation.md
@@ -45,13 +45,13 @@ julia> x = rand(Vec{2});
julia> gradient(norm, x)
2-element Vec{2, Float64}:
- 0.6103600560550116
- 0.7921241076829584
+ 0.5105128363207563
+ 0.859870132026771
julia> x / norm(x)
2-element Vec{2, Float64}:
- 0.6103600560550116
- 0.7921241076829584
+ 0.5105128363207563
+ 0.8598701320267711
```
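Since `d|x|/dx = x/|x|`, the two results in the doctest above must agree whatever values the RNG produces. A quick seed-independent check with a fixed vector (a sketch, not part of the docs):

```julia
using Tensors, LinearAlgebra

x = Vec{2}((3.0, 4.0))   # |x| = 5
∇ = gradient(norm, x)    # automatic differentiation of the Euclidean norm
@assert ∇ ≈ x / norm(x)  # d|x|/dx = x / |x|
```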

### Determinant of a second order symmetric tensor
@@ -65,13 +65,13 @@ julia> A = rand(SymmetricTensor{2,2});
julia> gradient(det, A)
2×2 SymmetricTensor{2, 2, Float64, 3}:
- 0.566237 -0.766797
- -0.766797 0.590845
+ 0.218587 -0.549051
+ -0.549051 0.325977
julia> inv(A)' * det(A)
2×2 SymmetricTensor{2, 2, Float64, 3}:
- 0.566237 -0.766797
- -0.766797 0.590845
+ 0.218587 -0.549051
+ -0.549051 0.325977
```
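The agreement shown in the doctest is Jacobi's formula, `∂det(A)/∂A = det(A) inv(A)'`, and holds for any RNG seed; a sketch of a local check:

```julia
using Tensors, LinearAlgebra

A = rand(SymmetricTensor{2, 2})
# Jacobi's formula for the derivative of the determinant
@assert gradient(det, A) ≈ det(A) * inv(A)'
```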

### Hessian of a quadratic potential
30 changes: 15 additions & 15 deletions docs/src/man/constructing_tensors.md
@@ -91,19 +91,19 @@ A tensor with random numbers is created using the function `rand`, applied to the
```jldoctest
julia> rand(Tensor{2, 3})
3×3 Tensor{2, 3, Float64, 9}:
- 0.590845 0.460085 0.200586
- 0.766797 0.794026 0.298614
- 0.566237 0.854147 0.246837
+ 0.325977 0.894245 0.953125
+ 0.549051 0.353112 0.795547
+ 0.218587 0.394255 0.49425
```

By specifying the type, `T`, a tensor of different type can be obtained:

```jldoctest
julia> rand(SymmetricTensor{2,3,Float32})
3×3 SymmetricTensor{2, 3, Float32, 6}:
- 0.0107703 0.305865 0.2082
- 0.305865 0.405684 0.257278
- 0.2082 0.257278 0.958491
+ 0.325977 0.549051 0.218587
+ 0.549051 0.894245 0.353112
+ 0.218587 0.353112 0.394255
```

## [Identity tensors](@id identity_tensors)
@@ -202,23 +202,23 @@ following code
```jldoctest fromarray
julia> data = rand(2, 5)
2×5 Matrix{Float64}:
- 0.590845 0.566237 0.794026 0.200586 0.246837
- 0.766797 0.460085 0.854147 0.298614 0.579672
+ 0.579862 0.972136 0.520355 0.839622 0.131026
+ 0.411294 0.0149088 0.639562 0.967143 0.946453
julia> tensor_data = reinterpret(Vec{2, Float64}, vec(data))
5-element reinterpret(Vec{2, Float64}, ::Vector{Float64}):
- [0.5908446386657102, 0.7667970365022592]
- [0.5662374165061859, 0.4600853424625171]
- [0.7940257103317943, 0.8541465903790502]
- [0.20058603493384108, 0.2986142783434118]
- [0.24683718661000897, 0.5796722333690416]
+ [0.5798621201341324, 0.4112941179498505]
+ [0.9721360824554687, 0.014908849285099945]
+ [0.520354993723718, 0.6395615996802734]
+ [0.8396219340580711, 0.967142768915383]
+ [0.13102565622085904, 0.9464532262313834]
```

The data can also be reinterpreted back to a Julia `Array`

```jldoctest fromarray
julia> data = reshape(reinterpret(Float64, tensor_data), (2, 5))
2×5 Matrix{Float64}:
- 0.590845 0.566237 0.794026 0.200586 0.246837
- 0.766797 0.460085 0.854147 0.298614 0.579672
+ 0.579862 0.972136 0.520355 0.839622 0.131026
+ 0.411294 0.0149088 0.639562 0.967143 0.946453
```
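The `reinterpret` round-trip shown in the two doctests above can be checked with deterministic data instead of `rand`, which makes the column-to-`Vec` correspondence explicit (a sketch):

```julia
using Tensors

data = collect(reshape(1.0:10.0, 2, 5))               # deterministic stand-in for rand(2, 5)
tensor_data = reinterpret(Vec{2, Float64}, vec(data)) # zero-copy view as five Vec{2}s
@assert tensor_data[1] == Vec{2}((1.0, 2.0))          # each column becomes one Vec

back = reshape(reinterpret(Float64, tensor_data), (2, 5))
@assert back == data                                  # the round-trip is lossless
```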
8 changes: 4 additions & 4 deletions docs/src/man/indexing.md
@@ -14,12 +14,12 @@ Indexing into a `(Symmetric)Tensor{dim, order}` is performed like for an `Array`
julia> A = rand(Tensor{2, 2});
julia> A[1, 2]
- 0.5662374165061859
+ 0.21858665481883066
julia> B = rand(SymmetricTensor{4, 2});
julia> B[1, 2, 1, 2]
- 0.24683718661000897
+ 0.4942498668904206
```

Slicing will produce a `Tensor` of lower order.
@@ -29,8 +29,8 @@ julia> A = rand(Tensor{2, 2});
julia> A[:, 1]
2-element Vec{2, Float64}:
- 0.5908446386657102
- 0.7667970365022592
+ 0.32597672886359486
+ 0.5490511363155669
```

Since `Tensor`s are immutable there is no `setindex!` function defined on them. Instead, use the functionality to create tensors from functions as described [here](@ref function_index). As an example, this sets the `[1,2]` index on a tensor to one and the rest to zero:
14 changes: 10 additions & 4 deletions docs/src/man/other_operators.md
@@ -259,21 +259,25 @@ illustrated by the following example, this will give the correct result. In general,
however, direct differentiation of `Tensor`s is faster (see
[Automatic Differentiation](@ref)).

- ```jldoctest
+ ```jldoctest ad-voigt
julia> using Tensors, ForwardDiff
julia> fun(X::SymmetricTensor{2}) = X;
julia> A = rand(SymmetricTensor{2,2});
+ ```

- # Differentiation of a tensor directly (correct)
+ Differentiation of a tensor directly (correct):
+ ```jldoctest ad-voigt
julia> tovoigt(gradient(fun, A))
3×3 Matrix{Float64}:
1.0 0.0 0.0
0.0 1.0 0.0
0.0 0.0 0.5
```

- # Converting to Voigt format, perform differentiation, convert back (WRONG!)
+ Converting to Voigt format, perform differentiation, convert back (WRONG!):
+ ```jldoctest ad-voigt
julia> ForwardDiff.jacobian(
v -> tovoigt(fun(fromvoigt(SymmetricTensor{2,2}, v))),
tovoigt(A)
@@ -282,8 +286,10 @@
1.0 0.0 0.0
0.0 1.0 0.0
0.0 0.0 1.0
+ ```

- # Converting to Mandel format, perform differentiation, convert back (correct)
+ Converting to Mandel format, perform differentiation, convert back (correct)
+ ```jldoctest ad-voigt
julia> tovoigt(
frommandel(SymmetricTensor{4,2},
ForwardDiff.jacobian(
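The reason the Voigt route fails is the factor 2 carried by the shear components: the Voigt basis is not orthonormal, while Mandel's √2 scaling is. Assembling the (truncated) doctests of this section into one self-contained sketch, with `fun` and `A` as in the docs:

```julia
using Tensors, ForwardDiff

fun(X::SymmetricTensor{2}) = X        # identity map, so the exact derivative is known
A = rand(SymmetricTensor{2, 2})

# Correct: differentiate the tensor directly
J_direct = tovoigt(gradient(fun, A))  # shear entry is 0.5

# Wrong: round-tripping through Voigt loses the factor 2 on shear components
J_voigt = ForwardDiff.jacobian(
    v -> tovoigt(fun(fromvoigt(SymmetricTensor{2, 2}, v))),
    tovoigt(A))                       # plain identity matrix: shear entry is 1.0

# Correct: Mandel's √2 scaling makes the basis orthonormal
J_mandel = tovoigt(frommandel(SymmetricTensor{4, 2},
    ForwardDiff.jacobian(
        v -> tomandel(fun(frommandel(SymmetricTensor{2, 2}, v))),
        tomandel(A))))

@assert J_direct ≈ J_mandel
@assert J_direct[3, 3] ≈ 0.5 && J_voigt[3, 3] ≈ 1.0
```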
28 changes: 14 additions & 14 deletions src/automatic_differentiation.jl
@@ -474,8 +474,8 @@ julia> A = rand(SymmetricTensor{2, 2});
julia> ∇f = gradient(norm, A)
2×2 SymmetricTensor{2, 2, Float64, 3}:
- 0.434906 0.56442
- 0.56442 0.416793
+ 0.374672 0.63107
+ 0.63107 0.25124
julia> ∇f, f = gradient(norm, A, :all);
```
@@ -507,20 +507,20 @@ julia> A = rand(SymmetricTensor{2, 2});
julia> ∇∇f = hessian(norm, A)
2×2×2×2 SymmetricTensor{4, 2, Float64, 9}:
[:, :, 1, 1] =
- 0.596851 -0.180684
- -0.180684 -0.133425
+ 0.988034 -0.271765
+ -0.271765 -0.108194
[:, :, 2, 1] =
- -0.180684 0.133546
- 0.133546 -0.173159
+ -0.271765 0.11695
+ 0.11695 -0.182235
[:, :, 1, 2] =
- -0.180684 0.133546
- 0.133546 -0.173159
+ -0.271765 0.11695
+ 0.11695 -0.182235
[:, :, 2, 2] =
- -0.133425 -0.173159
- -0.173159 0.608207
+ -0.108194 -0.182235
+ -0.182235 1.07683
julia> ∇∇f, ∇f, f = hessian(norm, A, :all);
```
@@ -594,15 +594,15 @@ julia> x = rand(Vec{3});
julia> f(x) = norm(x);
julia> laplace(f, x)
- 1.7833701103136868
+ 2.9633756571179273
julia> g(x) = x*norm(x);
julia> laplace.(g, x)
3-element Vec{3, Float64}:
- 2.107389336871036
- 2.7349658311504834
- 2.019621767876747
+ 1.9319830062026155
+ 3.2540895437409754
+ 1.2955087437219237
```
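For the Euclidean norm in `dim` dimensions, `Δ|x| = (dim - 1)/|x|`, which gives a seed-independent check of `laplace` (a sketch, not part of the docstring):

```julia
using Tensors, LinearAlgebra

x = Vec{3}((1.0, 2.0, 2.0))             # |x| = 3
# Laplacian of the Euclidean norm: Δ|x| = (dim - 1) / |x|
@assert laplace(norm, x) ≈ 2 / norm(x)  # = 2/3 here
```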
"""
function laplace(f::F, v) where F
8 changes: 4 additions & 4 deletions src/eigen.jl
@@ -50,13 +50,13 @@ julia> E = eigen(A);
julia> E.values
2-element Vec{2, Float64}:
- -0.1883547111127678
- 1.345436766284664
+ -0.27938877799585415
+ 0.8239521616782797
julia> E.vectors
2×2 Tensor{2, 2, Float64, 4}:
- -0.701412 0.712756
- 0.712756 0.701412
+ -0.671814 0.74072
+ 0.74072 0.671814
```
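Whatever the RNG produces, the returned pairs satisfy `A ⋅ vᵢ = λᵢ vᵢ`; a sketch of a seed-independent check:

```julia
using Tensors, LinearAlgebra

A = rand(SymmetricTensor{2, 2})
E = eigen(A)
for i in 1:2
    v = E.vectors[:, i]              # i-th eigenvector, sliced out as a Vec{2}
    @assert A ⋅ v ≈ E.values[i] * v  # defining property of an eigenpair
end
```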
"""
LinearAlgebra.eigen(::SymmetricTensor{2})
64 changes: 32 additions & 32 deletions src/math_ops.jl
@@ -10,12 +10,12 @@ Computes the norm of a tensor.
```jldoctest
julia> A = rand(Tensor{2,3})
3×3 Tensor{2, 3, Float64, 9}:
- 0.590845 0.460085 0.200586
- 0.766797 0.794026 0.298614
- 0.566237 0.854147 0.246837
+ 0.325977 0.894245 0.953125
+ 0.549051 0.353112 0.795547
+ 0.218587 0.394255 0.49425
julia> norm(A)
- 1.7377443667834922
+ 1.8223398556552728
```
"""
@inline LinearAlgebra.norm(v::Vec) = sqrt(dot(v, v))
@@ -61,12 +61,12 @@ Computes the determinant of a second order tensor.
```jldoctest
julia> A = rand(SymmetricTensor{2,3})
3×3 SymmetricTensor{2, 3, Float64, 6}:
- 0.590845 0.766797 0.566237
- 0.766797 0.460085 0.794026
- 0.566237 0.794026 0.854147
+ 0.325977 0.549051 0.218587
+ 0.549051 0.894245 0.353112
+ 0.218587 0.353112 0.394255
julia> det(A)
- -0.1005427219925894
+ -0.002539324113350679
```
"""
@inline LinearAlgebra.det(t::SecondOrderTensor{1}) = @inbounds t[1,1]
@@ -86,15 +86,15 @@ Computes the inverse of a second order tensor.
```jldoctest
julia> A = rand(Tensor{2,3})
3×3 Tensor{2, 3, Float64, 9}:
- 0.590845 0.460085 0.200586
- 0.766797 0.794026 0.298614
- 0.566237 0.854147 0.246837
+ 0.325977 0.894245 0.953125
+ 0.549051 0.353112 0.795547
+ 0.218587 0.394255 0.49425
julia> inv(A)
3×3 Tensor{2, 3, Float64, 9}:
- 19.7146 -19.2802 7.30384
- 6.73809 -10.7687 7.55198
- -68.541 81.4917 -38.8361
+ -587.685 -279.668 1583.46
+ -411.743 -199.494 1115.12
+ 588.35 282.819 -1587.79
```
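The large entries in the new sample output simply mean the randomly drawn `A` happens to be nearly singular; for any invertible `A`, `inv(A) ⋅ A = I` regardless of the seed. A sketch with a deliberately well-conditioned tensor (constructed from a function, an illustrative choice):

```julia
using Tensors

# Diagonally dominant, hence well-conditioned, second order tensor
A = Tensor{2, 3}((i, j) -> i == j ? 2.0 : 0.5)
@assert inv(A) ⋅ A ≈ one(A)  # defining property of the inverse
@assert A ⋅ inv(A) ≈ one(A)
```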
"""
@generated function Base.inv(t::Tensor{2, dim}) where {dim}
@@ -191,13 +191,13 @@ second order tensor `S`, such that `√S ⋅ √S == S`.
```jldoctest
julia> S = rand(SymmetricTensor{2,2}); S = tdot(S)
2×2 SymmetricTensor{2, 2, Float64, 3}:
- 0.937075 0.887247
- 0.887247 0.908603
+ 0.407718 0.298993
+ 0.298993 0.349237
julia> sqrt(S)
2×2 SymmetricTensor{2, 2, Float64, 3}:
- 0.776178 0.578467
- 0.578467 0.757614
+ 0.578172 0.270989
+ 0.270989 0.525169
julia> √S ⋅ √S ≈ S
true
@@ -232,12 +232,12 @@ Computes the trace of a second order tensor.
```jldoctest
julia> A = rand(SymmetricTensor{2,3})
3×3 SymmetricTensor{2, 3, Float64, 6}:
- 0.590845 0.766797 0.566237
- 0.766797 0.460085 0.794026
- 0.566237 0.794026 0.854147
+ 0.325977 0.549051 0.218587
+ 0.549051 0.894245 0.353112
+ 0.218587 0.353112 0.394255
julia> tr(A)
- 1.9050765715072775
+ 1.6144775244804341
```
"""
@generated function LinearAlgebra.tr(S::SecondOrderTensor{dim}) where {dim}
@@ -259,15 +259,15 @@ based on the additive decomposition.
```jldoctest
julia> A = rand(SymmetricTensor{2,3})
3×3 SymmetricTensor{2, 3, Float64, 6}:
- 0.590845 0.766797 0.566237
- 0.766797 0.460085 0.794026
- 0.566237 0.794026 0.854147
+ 0.325977 0.549051 0.218587
+ 0.549051 0.894245 0.353112
+ 0.218587 0.353112 0.394255
julia> vol(A)
3×3 SymmetricTensor{2, 3, Float64, 6}:
- 0.635026 0.0 0.0
- 0.0 0.635026 0.0
- 0.0 0.0 0.635026
+ 0.538159 0.0 0.0
+ 0.0 0.538159 0.0
+ 0.0 0.0 0.538159
julia> vol(A) + dev(A) ≈ A
true
@@ -286,12 +286,12 @@ julia> A = rand(Tensor{2, 3});
julia> dev(A)
3×3 Tensor{2, 3, Float64, 9}:
- 0.0469421 0.460085 0.200586
- 0.766797 0.250123 0.298614
- 0.566237 0.854147 -0.297065
+ -0.065136 0.894245 0.953125
+ 0.549051 -0.0380011 0.795547
+ 0.218587 0.394255 0.103137
julia> tr(dev(A))
- 0.0
+ 5.551115123125783e-17
```
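`vol` and `dev` implement the additive split `A = vol(A) + dev(A)` with `vol(A) = tr(A)/3 * I`, so the trace of the deviator vanishes only up to floating-point roundoff, which is why the doctest now prints `5.55e-17` instead of an exact `0.0`. A sketch of the invariants being updated here:

```julia
using Tensors, LinearAlgebra

A = rand(SymmetricTensor{2, 3})
@assert vol(A) ≈ tr(A) / 3 * one(A)               # volumetric part is spherical
@assert vol(A) + dev(A) ≈ A                       # additive decomposition
@assert isapprox(tr(dev(A)), 0.0; atol = 1e-12)   # trace-free up to roundoff
```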
"""
@inline function dev(S::SecondOrderTensor)
Expand Down