
Add plain Gradient Descent optimizer #6459

Merged: 14 commits merged into Qiskit:main from gradient-descent on Jun 1, 2021
Conversation

@Cryoris (Contributor) commented May 25, 2021

Summary

Add a plain gradient descent optimizer.

Details and comments

Any collection of optimizers needs a standard gradient descent scheme, at least for reference! 🙂 (And I need it for the QML summer school.)
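For reference, plain gradient descent iterates x_{k+1} = x_k − η_k ∇f(x_k) for a learning rate η_k. Below is a minimal usage sketch of the new optimizer through the legacy `optimize` interface; the objective and the `maxiter`/`learning_rate` values are illustrative, and with no `gradient_function` supplied the gradient falls back to a finite-difference approximation:

```python
import numpy as np

from qiskit.algorithms.optimizers import GradientDescent

def f(x):
    # Illustrative objective: squared distance of the norm of x from 1.
    return (np.linalg.norm(x) - 1) ** 2

initial_point = np.array([1.0, 0.5, -0.2])
optimizer = GradientDescent(maxiter=100, learning_rate=0.01)

# Legacy interface: returns (optimal point, optimal value, number of evaluations).
x_opt, fx_opt, nfevs = optimizer.optimize(initial_point.size, f, initial_point=initial_point)
```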

Todo:

  • Test the callback (see the sketch after this list)
  • Add an example in the docstring
  • Reno (release note)
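On the callback item: the optimizer accepts a `callback` invoked once per iteration, and per the commit list below it also receives the stepsize. A sketch of recording the optimization trace; the argument order (evaluations, point, value, stepsize) is an assumption, not confirmed on this page:

```python
import numpy as np

from qiskit.algorithms.optimizers import GradientDescent

history = []

def callback(nfevs, point, value, stepsize):
    # Assumed order: function evaluations, current point, objective value, stepsize.
    history.append((nfevs, value, stepsize))

optimizer = GradientDescent(maxiter=50, callback=callback)
optimizer.optimize(3, lambda x: (np.linalg.norm(x) - 1) ** 2,
                   initial_point=np.array([1.0, 0.5, -0.2]))
```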

@Cryoris force-pushed the gradient-descent branch from 925829d to 6e18912 on May 25, 2021 11:54
@Cryoris added the labels Changelog: New Feature (include in the "Added" section of the changelog) and mod: algorithms (related to the Algorithms module) on May 25, 2021
@Cryoris changed the title from "[WIP] Add plain Gradient Descent optimizer" to "Add plain Gradient Descent optimizer" on May 26, 2021
@Cryoris marked this pull request as ready for review on May 26, 2021 08:57
@Cryoris requested review from manoelmarques, woodsp-ibm and a team as code owners on May 26, 2021 08:57
```python
# pylint: disable=unused-argument
def optimize(
    self,
    num_vars,
```
Contributor:
Why not make num_vars optional, since it is unused? It is unused in the scipy optimizer, too (maybe others). Maybe fixing this in a uniform way is outside the scope of this PR.
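For illustration, one shape the suggestion could take against the legacy signature; the `None` default and inferring the dimension from `initial_point` are hypothetical, not part of this PR:

```python
def optimize(
    self,
    num_vars=None,  # hypothetical: made optional, as suggested above
    objective_function=None,
    gradient_function=None,
    variable_bounds=None,
    initial_point=None,
):
    # Hypothetical: infer the dimension from the initial point when omitted.
    if num_vars is None and initial_point is not None:
        num_vars = len(initial_point)
    ...
```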

@Cryoris (author):
Excellent question, this is part of #6382 🙂

@Cryoris added this to the 0.18 milestone on Jun 1, 2021
@mergify[bot] merged commit a9289c0 into Qiskit:main on Jun 1, 2021
ElePT pushed a commit to ElePT/qiskit that referenced this pull request Jun 27, 2023
* implement plain Gradient Descent optimizer

* replace leftover "SPSA" -> "GradientDescent"

* update docstring in init

* lint

* add a proper docstring with examples

* add callback test, mark slow test as slow_test

* add reno

* add test for learning rate as iterator

* add stepsize to callback

* Update qiskit/algorithms/optimizers/gradient_descent.py

Add copyright

* fix SPSA -> Gradient descent in docstrings

Co-authored-by: Steve Wood <40241007+woodsp-ibm@users.noreply.github.com>
Co-authored-by: mergify[bot] <37929162+mergify[bot]@users.noreply.github.com>
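One commit above adds a test for the learning rate given as an iterator, i.e. a per-iteration schedule instead of a constant. A sketch of a decaying schedule; passing a generator-returning callable is an assumption about the accepted form:

```python
from qiskit.algorithms.optimizers import GradientDescent

def decaying_learning_rate(initial=0.1, decay=0.9):
    # Yield a geometrically decaying learning rate, one value per iteration.
    eta = initial
    while True:
        yield eta
        eta *= decay

optimizer = GradientDescent(maxiter=100, learning_rate=decaying_learning_rate)
```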
ElePT pushed a commit to ElePT/qiskit-algorithms-test that referenced this pull request on Jul 17, 2023 (same commit list as above).
Labels: Changelog: New Feature (include in the "Added" section of the changelog); mod: algorithms (related to the Algorithms module)
Projects: None yet
3 participants