[Feature Request]: Warn user directly when custom loss is not differentiable #52
Comments
@Frightera,
The PR is in progress.
@Frightera,
This issue is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you.
The PR number is keras-team/keras#17753
@Frightera After a brief look at the PR, I wonder if we could generalize this feature to custom metrics and custom losses as well?
@haifeng-jin Right now it checks the custom losses provided by the user; I am not sure how it would be useful for custom metrics.
Sorry, only custom layers and losses. Custom metrics do not need to be differentiable.
@haifeng-jin I'll extend this to support custom layers as well.
Hi @haifeng-jin, Are there any updates on this issue & PR? Thanks
@Frightera,
This issue is stale because it has been open for 14 days with no activity. It will be closed if no further activity occurs. Thank you.
This issue was closed because it has been inactive for 28 days. Please reopen if you'd like to work on this further. |
System information.
TensorFlow version (you are using): TF 2.11
Are you willing to contribute it (Yes/No): Yes
Describe the feature and the current behavior/state.
Some people write custom loss functions from scratch that are not differentiable. This then leads to None gradients during training. Warning users directly could be a useful feature instead of only telling them that no gradients are provided. There are way too many Stack Overflow posts about this; I'll include some of them below, followed by a minimal reproduction of the failure:
https://stackoverflow.com/questions/63874265/keras-custom-loss-function-error-no-gradients-provided
https://stackoverflow.com/questions/73197501/raise-valueerror-no-gradients-provided-for-any-variables-custom-loss-function
https://stackoverflow.com/questions/59292992/tensorflow-2-custom-loss-no-gradients-provided-for-any-variable-error
https://stackoverflow.com/questions/65619581/no-gradients-provided-for-any-variable-for-custom-loss-function
https://stackoverflow.com/questions/70537503/custom-loss-function-error-valueerror-no-gradients-provided-for-any-variable
https://datascience.stackexchange.com/questions/116645/custom-loss-function-for-binary-classificatio-in-keras-gets-error-no-gradients
https://stackoverflow.com/questions/74074934/error-no-gradients-provided-for-any-variable-while-using-custom-loss
https://stackoverflow.com/questions/75738678/gradienttape-returning-none-with-custom-csi-loss-function
https://stackoverflow.com/questions/72259489/valueerror-no-gradients-provided-for-any-variable-custom-loss-function
...
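For reference, here is a minimal sketch of the failure mode, assuming a toy model and a custom loss that happens to use tf.round (whose registered gradient is None); the model, data shapes, and loss name are made up for illustration:

```python
import tensorflow as tf

# A custom loss built from a non-differentiable op: tf.round's registered
# gradient is None, so no gradient can flow from the loss back to the weights.
def rounded_mae(y_true, y_pred):
    return tf.reduce_mean(tf.abs(y_true - tf.round(y_pred)))

model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer="adam", loss=rounded_mae)

x = tf.random.normal((8, 4))
y = tf.random.uniform((8, 1))
# model.fit(x, y)  # raises: ValueError: No gradients provided for any variable
```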
Will this change the current API? How?
This will change the current API by adding some checks on the loss function before training starts, so that an error or warning can be raised; a rough sketch of such a check follows below.
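As an illustration only (this is not necessarily how the linked PR implements it), such a check could run the user's loss on a single dummy batch inside a tf.GradientTape and warn when every gradient comes back as None. The function name and signature below are hypothetical:

```python
import warnings
import tensorflow as tf

def warn_if_loss_not_differentiable(model, loss_fn, sample_x, sample_y):
    """Hypothetical pre-training check: evaluate loss_fn on one batch and
    warn if no gradient reaches any of the model's trainable variables."""
    with tf.GradientTape() as tape:
        y_pred = model(sample_x, training=True)
        loss_value = loss_fn(sample_y, y_pred)
    grads = tape.gradient(loss_value, model.trainable_variables)
    if model.trainable_variables and all(g is None for g in grads):
        warnings.warn(
            "The custom loss function appears to be non-differentiable with "
            "respect to the model's trainable variables; training would fail "
            "with 'No gradients provided for any variable'."
        )
```

Applied to the reproduction above, this would emit a clear warning before fit() fails with the more opaque optimizer error.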
Contributing