[7.0.0] Performance regression from 6.8 (over 3x slower in our test suite) #15853
Yeah, I can confirm this as well: our tests are much slower under 7.0. In our case, it's closer to 50% slower, although we have much longer tests overall. This is only happening in CI (GitHub Actions) for us; local runs are unaffected.
Probably related: #15827
We also noticed a heavy increase in the run time of our test suite (13 min to 28 min, over 2x slower) when upgrading from 6.8.0 to 7.0.0. Very interested in a workaround / fix for this.
I confirm; we rolled back to 6.9.
I can also confirm the increase in duration when switching versions.
We are observing the same thing. On my workstation it's as fast as ever, but on our CI machine (with fewer resources) it's taking 3x as long. Maybe it's using more memory?
CI was definitely the slowest for us, but we also saw the slowdown locally, just not as bad.
I narrowed it down above in #15853 (comment) (I'm still editing that comment).
Same here: almost an hour extra running the same tests on 7.0 and 7.0.1 compared to 6.9.1.
I can't see any obvious regression in the times it takes to run Cypress's own tests, but I don't think that is uncommon. Cypress uses itself for testing, but all of their tests are fast because they run against tiny pages with very little JS and few additional resources, whereas real-world Cypress usage is on resource-heavy pages. My own app is a very heavy SPA with 3 MB of JavaScript, and we get a 7x speed regression on low-spec CI machines and no regression on fast multi-core developer machines.
I believe #15841 might play a role here. Depending on the testing setup (locally / with a backend API running / CI), those incorrect network requests might have to wait out a timeout, which could be the cause of the slowdown.
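To illustrate that hypothesis (this is not a confirmed cause of the regression): if a request goes out unstubbed and hangs until a network timeout, stubbing it with `cy.intercept` makes it resolve immediately. A minimal sketch, with a placeholder route and page:

```js
// Stub the API call so the test never waits on a real (possibly hanging) request.
// '/api/items*' and '/items' are hypothetical names for illustration.
cy.intercept('GET', '/api/items*', { statusCode: 200, body: [] }).as('getItems');
cy.visit('/items');     // page under test issues GET /api/items
cy.wait('@getItems');   // resolves against the stub immediately, no network timeout
```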
We don't use intercept or any stubbing and still have a big slowdown.
There was a period of time when we accidentally did not squash PRs, but that shouldn't be the case any more. Most of the commits are likely due to us using a monorepo, as you're seeing commits to things outside of the binary.
I apologize for the delay. If you haven't seen my testing in #15779 (comment), take a look. After a lot of experimentation, I arrived at the following system for automatically bisecting, assuming we're looking for differences in Chrome/Electron CPU time. This gist contains the final run scripts. This was run on an AWS instance.
The process was initiated via `git bisect`, considering 0399a6e (tag 7.0.0) as known bad and 7331721 (tag 6.8.0) as known good. This process turned up b52ac98 as the first bad commit. This commit is notable as it updates to both a new major version of Electron (12.0, from 11.3.0) and a new major version of Node (14.16.0, from 12.18.3). However, it doesn't seem likely that upgrading Electron would change Chrome performance whatsoever.
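For readers who want to reproduce this kind of automated bisect, here is a generic sketch of the `git bisect run` workflow described above. The actual scripts are in the linked gist; `./measure.sh` is a hypothetical stand-in for them:

```sh
# ./measure.sh (hypothetical) should build the current checkout, run the
# benchmark, and exit 0 if CPU time is acceptable, 1 if it has regressed,
# or 125 to skip a commit that cannot be built.
git bisect start 0399a6e 7331721   # known bad (7.0.0), then known good (6.8.0)
git bisect run ./measure.sh        # git walks the history automatically
git bisect reset                   # return to the original HEAD when done
```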
Looking at the numbers, that does indeed seem to be the case: the Chrome numbers did not change significantly, other than one run being notably faster than the others. This also points to another potential problem. As seen in #15779 (comment), both Chrome and Electron were noted to slow down with each version upgrade.
TL;DR: b52ac98, the Electron 12 / Node 14 upgrade, is the first bad commit.
Anyone tried the latest 7.1.0?
Not sure how this relates, but I noticed extremely high CPU usage on my Mac when running tests from a local terminal on 7.1.0. I still need to double-check other apps on the Mac, but there wasn't such high CPU usage in earlier versions.
@GC-Mark we did; same result and time as 7.0 and 7.0.1.
Just for clarity, 7.1.0 does not include any changes we expect would influence performance.
Ha, I bet that's what you said about version 7.0.0, too.
I confirm that, too. Tests with both 7.0.1 and 7.1.0 take at least double the time compared to 6.9.1.
Same here, switching from version 6.8.0 to 7.1.0 directly.
Same; jumped from 6.2.0 to 7.1.0, and all suites take double the time to finish.
Hey, I don't want to pollute this thread with off-topic stuff, but the TL;DR is to pass values from Cypress (e.g. elapsed time) to a statsd server.
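For anyone curious what that looks like, here is a minimal sketch (not the commenter's actual setup) that reports an elapsed-time metric to statsd over UDP using only Node's built-in `dgram` module; the metric name, host, and port are placeholder assumptions:

```js
// send-timing.js - minimal statsd timing report over UDP, no client library.
const dgram = require('dgram');

function reportTiming(metric, ms, host = '127.0.0.1', port = 8125) {
  const socket = dgram.createSocket('udp4');
  // statsd wire format for a timing value: "<metric>:<ms>|ms"
  const payload = Buffer.from(`${metric}:${Math.round(ms)}|ms`);
  socket.send(payload, port, host, () => socket.close());
}

// Example: report how long a `cypress run` took (hypothetical metric name).
reportTiming('ci.cypress.run_duration', 1234567);
```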
@agg23, any update on the current investigation status, or a place to look for updates? Thanks in advance.
@flotwig is now the primary investigator on this issue. I am unaware of any new breakthroughs.
I have the same issue. I have a mock API using the intercept and fixture features; after updating to 7.0 it just stopped working! Running tests with the mock is now slower than with the real API!
> Fixes (avoids?) #15853. Electron v12.0.0-beta.16 and above contain an unknown bug causing a major slowdown when video recording is enabled. For now maybe we can downgrade Electron to this last known good version, v12.0.0-beta.14.
Will this commit give us any improvement?
@lukeapage yeah, this slowdown seems to be related to video recording, and there was a bug in Cypress 7 causing it to always be capturing frames of video. So with 04e854e you'll be able to at least disable video to avoid this issue.
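For anyone who just wants the concrete workaround once that lands: video capture is controlled by the `video` config option, so (sketched here for a JSON-based config, assuming the default `cypress.json` location) it can be disabled with:

```json
{
  "video": false
}
```

The same flag can be passed per run on the command line with `cypress run --config video=false`.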
> Fixes (avoids?) #15853. Electron v12.0.0-beta.16 and above contain an unknown bug causing a major slowdown when video recording is enabled. For now maybe we can downgrade Electron to this last known good version, v12.0.0-beta.14.
The code for this is done in cypress-io/cypress#16113, but has yet to be released.
Hey! This issue is not related to video recording. On our side we never record videos with Cypress, and we still encountered this issue when upgrading to 7.1.0, among other issues that we opened. @flotwig
@mmonteiroc #15853 (comment)
Released in a subsequent version. This comment thread has been locked. If you are still experiencing this issue after upgrading, please open a new issue.
I have split this off from the performance regression issue related to 6.6 to 6.7 (#15779). Can also confirm our tests are taking way longer in v7.0.0 than they were in v6.8.0.
Also, because the tests are taking longer, we get random failing tests due to timeout issues.
I'm not sure what info I can give you to help you debug this, but if there is anything you need me to do, just ask.
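As a stopgap against those timeout failures (not a fix for the underlying regression), one option is to raise Cypress's timeouts in `cypress.json`; these are real config options, but the values below are arbitrary examples:

```json
{
  "defaultCommandTimeout": 10000,
  "pageLoadTimeout": 120000
}
```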
Our last few CI runs:
- v6.8.0
- v7.0.0 (including a random failing test due to a timeout)
![Screenshot 2021-04-07 at 10 47 28](https://user-images.githubusercontent.com/1477806/113847002-f23a7680-978e-11eb-8e00-12ca325ba105.png)