Evaluating Julia code using eval_code can be slow if the number of local variables is very large #517

Closed
KristofferC opened this issue Feb 21, 2022 · 1 comment · Fixed by #518

In some cases with very big functions with lots of local variables, this code in eval_code:

eval_expr = Expr(:let,
    Expr(:block,
        map(x -> Expr(:(=), x...), [(v.name, QuoteNode(v.value isa Core.Box ? v.value.contents : v.value)) for v in vars])...,
        map(x -> Expr(:(=), x...), [(Symbol("%$i"), QuoteNode(data.ssavalues[i])) for i in defined_ssa])...,
        map(x -> Expr(:(=), x...), [(Symbol("@_$i"), QuoteNode(data.locals[i].value)) for i in defined_locals])...),

can become quite slow (on the order of 10 seconds). There might be a smarter way to do this than generating one big expression with all the locals (and slots and SSA values) and eval-ing it, so that the evaluation time scales with the size of the input expression rather than with the size of the frame. Interpreting the expression in the given frame (non-recursively) might work.
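
To make the scaling concrete, here is a minimal, self-contained sketch of the same pattern (not the package's actual code; the function name, the fake locals dict, and the sizes are illustrative): build one `let` expression that binds every local as a quoted value, then eval it. The eval call has to lower and compile the whole synthetic `let`, so its cost grows with the number of bindings rather than with the user's expression.

function eval_in_fake_frame(user_expr, locals::Dict{Symbol,Any})
    # One assignment per local, quoting the runtime value so it is spliced in as-is.
    bindings = [Expr(:(=), name, QuoteNode(val)) for (name, val) in locals]
    # let <bindings>; <user expression> end -- eval compiles the whole thing.
    eval_expr = Expr(:let, Expr(:block, bindings...), Expr(:block, user_expr))
    return Core.eval(Main, eval_expr)
end

# With thousands of fake locals, nearly all of the time goes into compiling the
# synthetic `let`, even though the user's expression is trivial.
locals = Dict{Symbol,Any}(Symbol("local_", i) => i for i in 1:10_000)
locals[:x] = 42
@time eval_in_fake_frame(:(x + 1), locals)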

KristofferC (Member, Author) commented Feb 21, 2022

It turns out that the let statement in

https://github.com/JuliaDebug/Infiltrator.jl/blob/6e91ec59ae75bea94dc0a0579bc340c839d7a3c8/src/Infiltrator.jl#L462-L493

makes this much slower than without it, probably because the block then gets compiled instead of interpreted. Without the let, even for the big function that made me open this issue, the code evaluates in ~0.01 seconds, which is acceptable. So the question is: what is a good way of shielding the evaluation in its own scope while still allowing the values to be GCed? Infiltrator uses a new module every time, but I am not sure modules ever get GCed, so that could rack up quite a lot of memory if the function is big and one does many evaluations.
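
For comparison, a rough sketch of the "fresh module, no let" approach mentioned above (my own illustration, not the fix that ended up in #518; the function and module names are made up): assign each local into a throwaway module and evaluate the user's expression there. Each top-level assignment is cheap, so the cost no longer scales with the number of locals, but if the module is never GCed it keeps every captured value alive.

function eval_in_scratch_module(user_expr, locals::Dict{Symbol,Any})
    # A fresh module serves as the evaluation scope (Base is imported by default).
    m = Module(:FrameScope)
    for (name, val) in locals
        # Plain top-level assignments of quoted values -- cheap to evaluate one by one.
        Core.eval(m, Expr(:(=), name, QuoteNode(val)))
    end
    return Core.eval(m, user_expr)
end

# The same 10_000-locals example as above finishes quickly here, but the module
# (and every value assigned into it) may stay reachable for the life of the session.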

@timholy, could there be a

let
    @force_interpret
    ...
end

?
