
Fix PrepSelPrep differentiability with parameter-shift #6423

Merged
merged 9 commits into master from 6331-bug-prepselprep-diff on Oct 23, 2024

Conversation

@andrijapau (Contributor) commented Oct 21, 2024

**Context:**

Prior to this fix, differentiating `PrepSelPrep` with `diff_method="parameter-shift"` returned all zeros, which was inconsistent with `backprop`:

```python
@qml.qnode(qml.device("default.qubit"), diff_method="parameter-shift")
def circuit(params):
    H = qml.Hamiltonian(params, [qml.Z(2), qml.X(1) @ qml.X(2)])
    qml.PrepSelPrep(H, control=[0])
    return qml.expval(qml.PauliZ(0))

>>> x = qml.numpy.array([0.25, 0.75])
>>> qml.jacobian(circuit)(x)
array([0., 0.])
```

**Description of the Change:**

A one-line change adding `PrepSelPrep.grad_method = None` 😺. This results in:

```python
@qml.qnode(qml.device("default.qubit"), diff_method="parameter-shift")
def circuit(params):
    H = qml.Hamiltonian(params, [qml.Z(2), qml.X(1) @ qml.X(2)])
    qml.PrepSelPrep(H, control=[0])
    return qml.expval(qml.PauliZ(0))

>>> x = qml.numpy.array([0.25, 0.75])
>>> qml.jacobian(circuit)(x)
array([-1.5, 0.5])
```

_Reason this works:_ The parameter-shift transform appears to try to shift the operator's parameters directly and fails to do so. Setting `grad_method=None` forces the operator to be decomposed before its data is differentiated. This does suggest some areas of improvement for `Operator.grad_method`.
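For reference, the fix amounts to setting a class attribute on `PrepSelPrep`; a minimal sketch of the idea (simplified class body, not the exact diff):

```python
import pennylane as qml

class PrepSelPrep(qml.operation.Operation):
    # No direct shift rule: gradient transforms must decompose this
    # operator before differentiating its data.
    grad_method = None

    # ... rest of the operator definition unchanged ...
```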

The `xfail` marks were also removed from the relevant tests. Certain test cases produce `NaN`s upon differentiation with *any* method, so those cases are now skipped as a *known* limitation of `MottonenStatePreparation`.
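For illustration, the skips might look like the following under `pytest` (the test name and coefficient values here are hypothetical, not the exact test-suite diff):

```python
import numpy as np
import pytest

# Hypothetical coefficient sets whose gradients are NaN with every
# differentiation method (a known MottonenStatePreparation limitation).
NAN_CASES = [np.array([0.5, -0.5])]

@pytest.mark.parametrize("coeffs", NAN_CASES)
def test_prepselprep_jacobian(coeffs):
    pytest.skip("Known MottonenStatePreparation limitation: gradient is NaN")
```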

**Benefits:** Gradient results are now consistent with `backprop`.
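As a quick sanity check of that consistency, the two methods can be compared directly (a sketch built from the example above, not part of the PR's test suite):

```python
import pennylane as qml
from pennylane import numpy as pnp

def make_circuit(diff_method):
    @qml.qnode(qml.device("default.qubit"), diff_method=diff_method)
    def circuit(params):
        H = qml.Hamiltonian(params, [qml.Z(2), qml.X(1) @ qml.X(2)])
        qml.PrepSelPrep(H, control=[0])
        return qml.expval(qml.PauliZ(0))
    return circuit

x = pnp.array([0.25, 0.75], requires_grad=True)
jac_ps = qml.jacobian(make_circuit("parameter-shift"))(x)
jac_bp = qml.jacobian(make_circuit("backprop"))(x)
assert pnp.allclose(jac_ps, jac_bp)  # both give array([-1.5, 0.5])
```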

**Possible Drawbacks:** None.

**Related GitHub Issues:** Fixes #6331

[sc-74920]

@andrijapau andrijapau self-assigned this Oct 21, 2024

Hello. You may have forgotten to update the changelog!
Please edit doc/releases/changelog-dev.md with:

  • A one-to-two sentence description of the change. You may include a small working example for new features.
  • A link back to this PR.
  • Your name (or GitHub username) in the contributors section.

@andrijapau andrijapau changed the title Fix PrepSelPrep differentiability with parameter-shift [WIP] Fix PrepSelPrep differentiability with parameter-shift Oct 21, 2024
@andrijapau andrijapau marked this pull request as ready for review October 21, 2024 19:16
@andrijapau andrijapau changed the title [WIP] Fix PrepSelPrep differentiability with parameter-shift Fix PrepSelPrep differentiability with parameter-shift Oct 21, 2024

codecov bot commented Oct 21, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 99.38%. Comparing base (21f916a) to head (309b825).
Report is 321 commits behind head on master.

Additional details and impacted files
```
@@            Coverage Diff             @@
##           master    #6423      +/-   ##
==========================================
- Coverage   99.70%   99.38%   -0.32%     
==========================================
  Files         447      447              
  Lines       42428    42429       +1     
==========================================
- Hits        42303    42169     -134     
- Misses        125      260     +135     
```

☔ View full report in Codecov by Sentry.

@andrijapau andrijapau requested a review from astralcai October 22, 2024 16:24
@astralcai (Contributor) left a comment


LGTM! 🚀

@albi3ro (Contributor) left a comment


👍

@albi3ro albi3ro merged commit 9e1e08e into master Oct 23, 2024
39 of 40 checks passed
@albi3ro albi3ro deleted the 6331-bug-prepselprep-diff branch October 23, 2024 13:32
mudit2812 pushed a commit that referenced this pull request Nov 11, 2024