[FIX backburner] Avoids spinning up unnecessary run loops via run.join #4698
Conversation
I don't think this has much statistical value; for the most part outliers can be largely ignored (on the basis that our environment is very noisy and not totally isolated), unless they happen to consistently and in non-trivial ways skew the distribution. My feeling is that for these…
This, though, I think has value, both in cleaning up the timeline and in grouping similar work together. As long as this isn't a breaking change (at first glance it doesn't look like it), it sounds like a good change!
I'm wondering if the five-number summary gets a bit too murky when dealing with less diverse distribution comparisons. There are other approaches we can consider, such as normalizing before comparing rather than comparing non-normalized distributions. Typical Gaussian normalization can hide skew, so it is absolutely not perfect; nearly nothing is actually distributed that way, and real-world distributions are more often log-normal… Ultimately statistics is silly :p, but it's likely the only tool we have for this sort of thing. Maybe at some point we sit down and brainstorm a bit. My goal here isn't to be pedantic, but to avoid common pitfalls we humans run into when trying to reason about these sorts of things.
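For concreteness, here is a rough sketch (my own illustration, not code from this PR) of the kind of comparison being discussed: a five-number summary computed over raw or log-transformed timings, so a skewed, roughly log-normal distribution is easier to compare across runs.

```js
// Sketch only: summarize timing samples via a five-number summary, optionally on a
// log scale so skew in a roughly log-normal distribution is less misleading.
function fiveNumberSummary(samples, { logScale = false } = {}) {
  const xs = samples
    .map((x) => (logScale ? Math.log(x) : x))
    .sort((a, b) => a - b);

  // Linear-interpolated quantile over the sorted samples.
  const quantile = (p) => {
    const idx = (xs.length - 1) * p;
    const lo = Math.floor(idx);
    const hi = Math.ceil(idx);
    return xs[lo] + (xs[hi] - xs[lo]) * (idx - lo);
  };

  return {
    min: xs[0],
    q1: quantile(0.25),
    median: quantile(0.5),
    q3: quantile(0.75),
    max: xs[xs.length - 1],
  };
}

// e.g. compare fiveNumberSummary(masterTimings, { logScale: true })
//      against fiveNumberSummary(prTimings, { logScale: true })
```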
Overall this change looks good. 👍
@@ -2270,37 +2270,36 @@ Store = Service.extend({
  */
  _push(data) {
    let token = heimdall.start('store._push');
    let included = data.included;
    let i, length;
    let ret = this._backburner.join(() => {
Can you rename this back to internalModel?
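For context on why join matters in the diff above, here is a minimal standalone sketch (assuming the backburner.js package; this is not ember-data's code). Backburner's run() always opens a fresh run loop, whereas join() reuses a currently open loop when one exists, so nested calls such as repeated store._push invocations don't each spin up and flush their own loop.

```js
import Backburner from 'backburner.js'; // assumption: standalone backburner.js package

const backburner = new Backburner(['actions']);

backburner.run(() => {
  // Inside an already-open loop, join() piggybacks on it instead of opening a new
  // one, so work scheduled here flushes together with the outer loop.
  backburner.join(() => {
    backburner.schedule('actions', null, () => console.log('flushed with the outer loop'));
  });
});

// Outside any loop, join() behaves like run(): it opens (and then flushes) a loop itself.
backburner.join(() => {
  backburner.schedule('actions', null, () => console.log('flushed in its own loop'));
});
```

That reuse is what the diff relies on: wrapping the body of _push in this._backburner.join(...) lets a push that already happens inside a run loop join it rather than creating an unnecessary one.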
Notes:
Numbers are measuring store._push for 238 records of a single type, in milliseconds, from 35 runs comparing current master to this PR.

current master
PR
cc @stefanpenner @krisselden
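As a rough illustration of the measurement described in the notes above (the makeStore and payload names and the use of performance.now() are my assumptions, not necessarily how these numbers were gathered):

```js
// Hypothetical harness: time store._push over repeated runs, collecting milliseconds.
function benchmarkPush(makeStore, payload, runs = 35) {
  const timings = [];
  for (let i = 0; i < runs; i++) {
    const store = makeStore(); // fresh store per run so each push creates records rather than updating them
    const start = performance.now();
    store._push(payload);
    timings.push(performance.now() - start);
  }
  return timings;
}
```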
Edit:
I realized I should have been benchmarking against the complex scenario, in which we actually have relationships. The complex scenario loads 34 primary records with 6 related records per primary record, for the same total of 238 records. New numbers are below; before going "wait, where's the improvement?!" please see the updated analysis in the related PR #4668.

current master
PR
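For reference, a hypothetical JSON API payload builder matching the complex scenario described in the edit above; the post/comment types and attributes are made up, only the counts come from the comment (34 primary records, each with 6 related records, for 238 records total).

```js
function buildComplexPayload(primaryCount = 34, relatedPerPrimary = 6) {
  const data = [];
  const included = [];

  for (let i = 0; i < primaryCount; i++) {
    const related = [];
    for (let j = 0; j < relatedPerPrimary; j++) {
      const relatedId = `${i}-${j}`;
      related.push({ type: 'comment', id: relatedId });
      included.push({
        type: 'comment',
        id: relatedId,
        attributes: { body: `comment ${relatedId}` },
      });
    }
    data.push({
      type: 'post',
      id: String(i),
      attributes: { title: `post ${i}` },
      relationships: { comments: { data: related } },
    });
  }

  return { data, included }; // 34 primary + 34 * 6 related = 238 records
}
```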