When running my grunt-phantomas test with the `--debug` flag, I noticed the tests are not executed sequentially. Instead, they appear to run all at once: iTerm and Activity Monitor show every phantomjs instance being spun up simultaneously for the number of runs requested.
This means that instead of getting a more reliable mean by doing more runs, specifying a large number of runs actually degrades the performance and timings of every run. 🙅
```
PHANTOMAS EXECUTION(S) STARTED.
Executing phantomas ( 5 times ) with following parameters:
{"timeout":10,"allow-domain":"content.jwplatform.com,p.jwpcdn.com,i.n.jwpltx.com"}
[D] server GET /test/performance/setup-responsive-autostart.html 200 1882 - 13 ms
[D] server GET /test/performance/setup-responsive-autostart.html 200 1882 - 9 ms
[D] server GET /test/performance/setup-responsive-autostart.html 200 1882 - 1 ms
[D] server GET /test/performance/phantomas/test-module.js 200 10260 - 2 ms
[D] server GET /test/performance/phantomas/test-module.js 200 10260 - 4 ms
[D] server GET /bin-release/test.js 200 72325 - 6 ms
[D] server GET /test/assets/providers/html5.provider.timer.js 200 3316 - 4 ms
[D] server GET /bin-release/test.js 200 72325 - 4 ms
[D] server GET /test/assets/providers/html5.provider.timer.js 200 3316 - 2 ms
[D] server GET /test/performance/phantomas/test-module.js 200 10260 - 1 ms
[D] server GET /bin-release/test.js 200 72325 - 2 ms
[D] server GET /test/assets/providers/html5.provider.timer.js 200 3316 - 1 ms
[D] server GET /bin-release/test.html5.js 200 172412 - 3 ms
[D] server GET /bin-release/test.html5.js 200 172412 - 5 ms
[D] server GET /bin-release/test.html5.js 200 172412 - 3 ms
[D] server GET /test/performance/setup-responsive-autostart.html 200 1882 - 1 ms
[D] server GET /test/performance/phantomas/test-module.js 200 10260 - 4 ms
[D] server GET /test/assets/providers/html5.provider.timer.js 200 3316 - 3 ms
[D] server GET /bin-release/test.js 200 72325 - 4 ms
[D] server GET /test/performance/setup-responsive-autostart.html 200 1882 - 3 ms
[D] server GET /test/performance/phantomas/test-module.js 200 10260 - 4 ms
[D] server GET /test/assets/providers/html5.provider.timer.js 200 3316 - 5 ms
[D] server GET /bin-release/test.js 200 72325 - 5 ms
[D] server GET /bin-release/test.html5.js 200 172412 - 1 ms
[D] server GET /bin-release/test.html5.js 200 172412 - 1 ms
>> 5 Phantomas execution(s) done -> checking results:
>> Phantomas execution successful.
>> Phantomas execution successful.
>> Phantomas execution successful.
>> Phantomas execution successful.
>> Phantomas execution successful.
```
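For comparison, here is a minimal Node.js sketch of how the runs could be serialized so that only one phantomjs process is alive at a time. This is not grunt-phantomas's actual implementation; `runPhantomasOnce`, `TEST_URL`, and `NUMBER_OF_RUNS` are hypothetical names used for illustration.

```js
// Sketch only: serializes phantomas runs so timings don't skew each other.
// Assumes the `phantomas` CLI is on the PATH; names below are hypothetical.
const { execFile } = require('child_process');

const TEST_URL = 'http://localhost:8000/test/performance/setup-responsive-autostart.html';
const NUMBER_OF_RUNS = 5;

function runPhantomasOnce(url) {
  return new Promise((resolve, reject) => {
    // One child process per run; the next run starts only after this exits.
    execFile('phantomas', [url, '--timeout=10'], (err, stdout) => {
      if (err) return reject(err);
      resolve(stdout);
    });
  });
}

async function runSequentially(url, runs) {
  const results = [];
  for (let i = 0; i < runs; i++) {
    // Awaiting inside the loop guarantees only one phantomjs instance
    // is running at a time, unlike spawning all runs up front.
    results.push(await runPhantomasOnce(url));
  }
  return results;
}

runSequentially(TEST_URL, NUMBER_OF_RUNS)
  .then((results) => console.log(`${results.length} run(s) finished`))
  .catch(console.error);
```

The key difference from the observed behavior is that the await inside the loop creates a dependency chain between runs, whereas launching all child processes up front makes them compete for CPU and I/O and distorts the very metrics being measured.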