Following our Slack discussion: `thread = True()` makes the broadcasting in the time integration significantly faster on multiple threads, but it is not used in the elixirs.

Without multithreaded time integration:

With `thread = True()`:

As @JoshuaLampert and @ranocha pointed out, there are also issues when switching to Base threads, since OrdinaryDiffEq.jl will still use Polyester.jl; we should check these aspects before switching everything.
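For reference, enabling this in an elixir only requires passing the keyword to the time integration method. A minimal, hypothetical example (the solver choice and the `ode`, `callbacks`, and `dt` values are stand-ins for whatever the elixir actually uses):

```julia
using OrdinaryDiffEq

# `thread = OrdinaryDiffEq.True()` makes the broadcasting inside the
# RK stage updates multithreaded; without it, these updates run serially.
sol = solve(ode,
            CarpenterKennedy2N54(williamson_condition = false,
                                 thread = OrdinaryDiffEq.True());
            dt = 1.0, save_everystep = false, callback = callbacks)
```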
For my multi-GPU prototype of TrixiParticles, I built a custom wrapper data type with a custom broadcasting style to make sure that all broadcasting in the time integration itself is also done on multiple GPUs with `@threaded`.

I can easily adapt this to have a `ThreadedBroadcastArray` data type that defines broadcasting with `Trixi.@threaded` (a sketch of such a wrapper follows below). This way, all time integration schemes are automatically multithreaded in the same way as the rest of Trixi, even if (when?) we move away from Polyester.jl.

We only need to change
`Trixi.jl/src/semidiscretization/semidiscretization.jl`, line 95 at commit 1e1f643,

to
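To make the proposal concrete, here is a minimal, hypothetical sketch of such a wrapper type. This is not the actual TrixiParticles/Trixi implementation: it uses `Threads.@threads` where `Trixi.@threaded` would be used in practice, and all names besides `ThreadedBroadcastArray` (the style name, field names, etc.) are made up for illustration.

```julia
# Hypothetical sketch of a vector wrapper whose broadcasts run multithreaded.
# In Trixi, `Trixi.@threaded` would replace `Threads.@threads` below.
using Base.Threads: @threads

struct ThreadedBroadcastArray{T, V <: AbstractVector{T}} <: AbstractVector{T}
    data::V
end

Base.size(x::ThreadedBroadcastArray) = size(x.data)
Base.@propagate_inbounds Base.getindex(x::ThreadedBroadcastArray, i::Int) = x.data[i]
Base.@propagate_inbounds Base.setindex!(x::ThreadedBroadcastArray, v, i::Int) = (x.data[i] = v; x)
Base.similar(x::ThreadedBroadcastArray, ::Type{T}, dims::Dims{1}) where {T} =
    ThreadedBroadcastArray(similar(x.data, T, dims))

# Custom broadcast style: any broadcast expression involving a
# `ThreadedBroadcastArray` is materialized by the threaded `copyto!` below.
struct ThreadedBroadcastStyle <: Broadcast.AbstractArrayStyle{1} end
ThreadedBroadcastStyle(::Val) = ThreadedBroadcastStyle()
Base.BroadcastStyle(::Type{<:ThreadedBroadcastArray}) = ThreadedBroadcastStyle()

# Allocating broadcasts (e.g. `a .* u`) also return a wrapped array.
Base.similar(bc::Broadcast.Broadcasted{ThreadedBroadcastStyle}, ::Type{T}) where {T} =
    ThreadedBroadcastArray(similar(Vector{T}, axes(bc)))

# In-place broadcasts (e.g. `u .= u .+ dt .* k`): split the element loop across threads.
function Base.copyto!(dest::ThreadedBroadcastArray,
                      bc::Broadcast.Broadcasted{ThreadedBroadcastStyle})
    bc = Broadcast.instantiate(bc)
    @threads for i in eachindex(dest.data)
        @inbounds dest.data[i] = bc[i]
    end
    return dest
end
```

With such a wrapper, wrapping the ODE state vector once is enough for every broadcast inside the time integrator to hit the threaded `copyto!` automatically, e.g.:

```julia
# Usage sketch with made-up data: these broadcasts now run on all threads.
u = ThreadedBroadcastArray(rand(10^7))
k = ThreadedBroadcastArray(rand(10^7))
u .= u .+ 0.5 .* k
```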
Would this be desired in Trixi.jl? @ranocha @sloede