Add a performance benchmarking tool #419
There is also pytest-benchmark, which could integrate with the test suite, but I think it might be better to have a separate benchmark suite because a lot of the tests don't really reflect realistic usage. As far as running benchmarks in a CI service goes, this thread has some insights: astropy/astropy#6149

Is there still interest in benchmarking pvlib? I was playing around with asv a bit tonight and would be happy to pick up where #530 left off.
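If the pytest-benchmark route were taken, the hook into the existing test suite might look roughly like this (a sketch only; the `benchmark` fixture comes from the pytest-benchmark plugin, and the clearsky call is just an arbitrary stand-in for a realistic workload):

```python
import pandas as pd
import pvlib


def test_clearsky_speed(benchmark):
    times = pd.date_range('2020-01-01', '2020-01-02',
                          freq='1min', tz='Etc/GMT+7')
    location = pvlib.location.Location(32.2, -110.9, altitude=700)
    # the fixture calls the wrapped function repeatedly and records stats
    benchmark(location.get_clearsky, times, model='ineichen')
```

The downside, as noted above, is that the timings would be tied to test-style inputs rather than to a purpose-built benchmark suite.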
Kevin, benchmarking was also proposed as a GSoC idea; check out the wiki.

Definitely interested! I might soon have dedicated hardware available for this if/when the software is ready for it. And feel free to take over my PR or start fresh.

Oh sorry, I didn't realize, thanks @mikofski. If it doesn't get chosen as a GSoC project then I'll pick this up.

Perhaps you could help mentor? OK with me if you want to take it on anytime, don't let GSoC interfere 😀 thanks!
I'm not optimistic this is going to be picked up in GSoC.
After playing around with asv a bit more, I'd like to get feedback on an implementation plan.

Some quick background for people who haven't used asv before: it's sort of like pytest in that you create a folder of code snippets you want to benchmark (a sketch of one is below) and invoke asv on them to generate a set of timings. Those timings are associated with the machine and the pvlib git commit used to run the benchmarks. When new code is committed, the benchmarks are rerun, creating a history of timings. This history is then used to generate an HTML report like this example from astropy that shows the timing history for each benchmark. asv also has some other nice features, like regression detection logic and a binary search to determine which commit introduced a regression.

A goal that seems useful and achievable to me is to have a machine somewhere that does a nightly benchmark run on whatever commits were added to master that day. Running the benchmarks manually and inspecting the results is a bit of a hassle, so I want the process to be automated and generating results consistently over time, so that they are available whenever they're needed.

Choice of machine: in terms of timing stability, a dedicated machine is ideal, but a virtual cloud server is more convenient. I've been running a basic pvlib benchmark suite on a cheapo cloud VPS and getting surprisingly stable results -- not as stable as the astropy timings, but good enough to detect significant shifts. It could be that I'm just getting lucky and running during stable conditions.

Tracking timing history: the timing history is stored in JSON files, so it is reasonable to use git to keep track of them over time. asv also has a convenient command to push the HTML report to a gh-pages branch.

Long term it might be cool to somehow integrate benchmarks with a CI PR check, but I think it makes sense to take smaller steps and just benchmark master for now, to get more familiar with asv and figure out how to get the most value from it.

So to summarize, here's what I'm proposing:

- add an asv benchmark suite that reflects realistic pvlib usage
- run the benchmarks nightly against the day's commits to master, on a VPS for now
- track the timing history in git and publish the HTML report
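For concreteness, here's a minimal sketch of what one benchmark file could look like. The module layout and the choice of pvlib functions are just illustrative picks; the `setup` hook and `time_` prefix are asv's naming conventions:

```python
# benchmarks/solarposition.py -- illustrative only; which functions we'd
# actually benchmark is an open question.
import pandas as pd
import pvlib


class SolarPosition:
    def setup(self):
        # setup() builds the inputs and is excluded from the timings
        self.times = pd.date_range('2020-01-01', '2021-01-01',
                                   freq='1h', tz='Etc/GMT+7')
        self.lat, self.lon = 35.1, -106.6

    def time_spa_python(self):
        # asv times every method whose name starts with time_
        pvlib.solarposition.spa_python(self.times, self.lat, self.lon)

    def time_ephemeris(self):
        pvlib.solarposition.ephemeris(self.times, self.lat, self.lon)
```

Running `asv run` would then produce a timing for each `time_*` method, tagged with the machine and the commit.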
Does this sound like a good plan?
Wow, great research Kevin! Sounds good to me. My preference would be for a dedicated machine, paid for by a dedicated source. Who is paying for the cheapo VPS?

I have a squad of small VPSs I use for personal projects and have just been testing on one of those. The cost is a drop in the bucket, not worth worrying about. Agreed that a dedicated machine is preferable.
We should look into using a package like asv for performance benchmarks. PRs such as #409 would benefit from a standard set of performance tests that cover multiple use cases.