Add plain Gradient Descent optimizer #6459
Conversation
```python
# pylint: disable=unused-argument
def optimize(
    self,
    num_vars,
```
Why not make `num_vars` optional, since it is unused? It is unused in the scipy optimizer, too (maybe others). Maybe fixing this in a uniform way is outside the scope of this PR.
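One way to make the argument optional, sketched below under the assumption that callers may still pass it positionally: give `num_vars` a default of `None` and keep it in the signature for interface compatibility. The class and parameter layout here are illustrative, not the actual `Optimizer` interface from the PR.

```python
class SketchOptimizer:
    """Illustrative optimizer stub, not the actual Qiskit class."""

    # pylint: disable=unused-argument
    def optimize(self, num_vars=None, objective_function=None,
                 gradient_function=None, initial_point=None):
        # num_vars is accepted for backward compatibility but ignored;
        # existing positional callers keep working unchanged.
        return objective_function(initial_point)
```

Older call sites such as `opt.optimize(2, fun, None, x0)` would then behave identically to the keyword form `opt.optimize(objective_function=fun, initial_point=x0)`.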
Excellent question, this is part of #6382 🙂
* implement plain Gradient Descent optimizer
* replace leftover "SPSA" -> "GradientDescent"
* update docstring in init
* lint
* add a proper docstring with examples
* add callback test, mark slow test as slow_test
* add reno
* add test for learning rate as iterator
* add stepsize to callback
* Update qiskit/algorithms/optimizers/gradient_descent.py: add copyright
* fix SPSA -> Gradient descent in docstrings

Co-authored-by: Steve Wood <40241007+woodsp-ibm@users.noreply.github.com>
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
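The commit list above mentions a test for the learning rate supplied as an iterator. A minimal sketch of how a descent loop can accept either a constant step size or a per-step sequence is below; the function name and signature are illustrative, not the PR's actual API.

```python
import itertools


def descend(gradient, x0, learning_rate, maxiter=100):
    """Gradient descent where learning_rate is a float or an iterable
    yielding one step size per iteration (e.g. a decaying schedule)."""
    if isinstance(learning_rate, (int, float)):
        etas = itertools.repeat(learning_rate)  # constant step size
    else:
        etas = iter(learning_rate)  # user-supplied schedule
    x = list(x0)
    for _, eta in zip(range(maxiter), etas):
        x = [xi - eta * gi for xi, gi in zip(x, gradient(x))]
    return x
```

A decaying schedule such as `(1 / k for k in itertools.count(1))` can then be passed directly as `learning_rate`.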
Summary
Add a plain gradient descent optimizer.
Details and comments
Any collection of optimizers needs a standard gradient descent scheme, at least for reference! 🙂 (And I need it for the QML summer school.)
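For reference, plain gradient descent iterates the update x_{k+1} = x_k - eta * grad f(x_k). A minimal self-contained sketch, assuming a fixed learning rate and a user-supplied gradient function (names are illustrative, not the PR's actual `GradientDescent` API):

```python
def gradient_descent(gradient, x0, learning_rate=0.1, maxiter=1000, tol=1e-7):
    """Minimize f via x_{k+1} = x_k - learning_rate * grad f(x_k).

    Stops when the gradient norm drops below tol or maxiter is reached.
    """
    x = list(x0)
    for _ in range(maxiter):
        grad = gradient(x)
        if sum(gi * gi for gi in grad) ** 0.5 < tol:
            break  # converged: gradient is (numerically) zero
        x = [xi - learning_rate * gi for xi, gi in zip(x, grad)]
    return x


# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
result = gradient_descent(lambda x: [2 * (x[0] - 3)], [0.0])
```

The example converges to the minimizer x = 3; the actual optimizer in the PR additionally supports callbacks and non-constant learning rates, per the commit list.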
Todo: