Unnecessary invalidation of event model caches #12
Original comment by Philipp Reinkemeier (Bitbucket: preinkemeier, GitHub: preinkemeier). I attached a file containing a version of the function _invalidate_event_model_caches(task) which I believe should fix the problem.
Original comment by Johannes Schlatow (Bitbucket: jschlatow, GitHub: jschlatow). Thank you very much for the detailed report. I understand your argumentation. I could even go a bit further and argue that this invalidation is not necessary at all, because propagation always replaces the entire event model object and hence also the caches. I'm a bit hesitant to apply these changes though. Let me elaborate: in my opinion, there is a trade-off between straightforward, easy-to-read code and performance enhancement. With respect to the latter, I doubt that it will improve performance very much in general. The event model caches avoid recomputing delta/eta values, which becomes important for complex event model functions. However, caching is only effective if the input event model remains constant over multiple iterations, which, I suppose, rarely happens for dependent tasks. On the other hand, recomputing delta/eta values for parametrised event models (e.g. PJd) shouldn't diminish performance very much. If I haven't missed a corner case, I think it should be safe to add a switch that disables this invalidation entirely, so that users can decide whether to "keep it safe" or to increase performance. What do you think about this?
Original comment by Philipp Reinkemeier (Bitbucket: preinkemeier, GitHub: preinkemeier). Of course it is up to you, but I do not understand the argument. I do understand that there is, in general, a trade-off between performance improvement and easy-to-read, maintainable code. But in this case I even found it irritating and misleading that the cache was invalidated, because from a semantic point of view it makes no sense. So yes, it may not yield a notable performance improvement, but at least for me there is no trade-off to consider here: not invalidating the cache makes it explicit in which cases the cached values can be relied upon, and in which cases not.
Original report by Philipp Reinkemeier (Bitbucket: preinkemeier, GitHub: preinkemeier).
In the function _invalidate_event_model_caches(task) in the file analysis.py, the event model caches of task are invalidated along with the event model caches of all its reachable successor tasks. This function is only called by the function _propagate(task, task_results), defined in the same file, which in turn is used during initialization of the class GlobalAnalysisState and in the central function analyze_system(system, task_results=None, only_dependent_tasks=False, progress_hook=None).
In both cases, there is no need to invalidate the caches of the input event model of the task that _propagate is called with: the jitter and busy times of task have changed, so its output (!) event model has changed, and therefore so have the input event models of all its direct and indirect successor tasks.
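The argument above can be illustrated with a toy model of the propagation step: after a task's busy times and jitter change, propagation derives a new output event model and installs it as each successor's input model, while the task's own input event model (and any delta/eta values cached on it) is untouched. All names here (EventModel, TaskStub, propagate, wcrt_jitter) are simplified stand-ins for illustration, not pycpa's actual API:

```python
class EventModel:
    def __init__(self, jitter):
        self.jitter = jitter
        self.cache = {}  # cached delta/eta values, keyed by n

class TaskStub:
    def __init__(self, wcrt_jitter=0):
        self.successors = []
        self.in_event_model = None
        self.wcrt_jitter = wcrt_jitter

def propagate(task):
    """Replace each successor's input model with a freshly derived one."""
    out_model = EventModel(task.wcrt_jitter)  # new object, empty cache
    for succ in task.successors:
        succ.in_event_model = out_model
```

Since the successors receive a brand-new object with an empty cache, invalidating the propagating task's own input cache accomplishes nothing.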
The following version of the function _invalidate_event_model_caches(task) would fix that:
def _invalidate_event_model_caches(task):
    """ Invalidate all event model caches """
    for t in util.breadth_first_search(task):
        if t is not task:
            t.invalidate_event_model_cache()
Note that the set returned by util.breadth_first_search(task) also contains task itself, so it must be filtered out. One could also change that function to NOT include the argument task in its resulting set, but since that function is used in several other places, I do not know whether this would break anything.
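The non-invasive alternative mentioned above can be sketched as a wrapper: keep the shared utility unchanged (so its other callers are unaffected) and subtract the start node once at the call site. The Node class, the stand-in breadth_first_search, and the wrapper name reachable_successors are illustrative assumptions:

```python
class Node:
    def __init__(self, name):
        self.name = name
        self.successors = []

def breadth_first_search(start):
    """Stand-in for util.breadth_first_search: reachable set incl. start."""
    visited, queue = {start}, [start]
    while queue:
        for s in queue.pop(0).successors:
            if s not in visited:
                visited.add(s)
                queue.append(s)
    return visited

def reachable_successors(task):
    """Reachable tasks excluding task itself; other callers keep the old
    semantics of breadth_first_search untouched."""
    return breadth_first_search(task) - {task}
```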