Check differentiability of custom loss function before training #17753
Conversation
Will continue the discussion on the issue keras-team/tf-keras#52 to re-scope this PR.
@haifeng-jin Can you take a look at it again? This now supports custom layers as well. Since checking a custom layer can raise arbitrary errors, the check uses nested try-except blocks, which may not be best practice.
Hi @haifeng-jin, is there any update on this? Every couple of weeks on Stack Overflow I see questions from users new to Keras who try to use a non-differentiable loss function.
Need a review from @qlzh727, since I do not have enough knowledge to review this PR.
Hello, thank you for submitting a pull request. We're currently in the process of migrating the new …
The feature request was made in keras-team/tf-keras#52.
This is a very common mistake: users define custom loss functions or loss classes that are not differentiable, which leads to `None` gradients during fitting. This check makes it easier for users to interpret the problem.

Example usage:
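The original snippet under "Example usage" did not survive extraction, so below is a minimal, self-contained sketch of the kind of check this PR describes. The helper name `check_loss_differentiability`, its signature, and the error message are illustrative assumptions, not the PR's actual API; the idea is simply to probe the loss with dummy data under a `tf.GradientTape` and fail fast when the gradient comes back as `None`:

```python
import tensorflow as tf


def check_loss_differentiability(loss_fn, y_shape=(4, 1)):
    """Hypothetical helper: probe loss_fn with dummy data and fail
    fast if gradients w.r.t. the predictions are None."""
    y_true = tf.random.uniform(y_shape)
    y_pred = tf.Variable(tf.random.uniform(y_shape))
    with tf.GradientTape() as tape:
        loss = loss_fn(y_true, y_pred)
    # If every path from y_pred to the loss goes through a
    # non-differentiable op, the gradient comes back as None.
    if tape.gradient(loss, y_pred) is None:
        raise ValueError(
            "Loss function is not differentiable: gradients with "
            "respect to the predictions are None."
        )


# A typical user mistake: tf.round has no registered gradient,
# so this loss yields None gradients during fit().
def rounded_mse(y_true, y_pred):
    return tf.reduce_mean(tf.square(y_true - tf.round(y_pred)))


check_loss_differentiability(rounded_mse)  # raises ValueError
```

Running the probe once before training surfaces the mistake immediately, instead of letting `fit()` silently produce `None` gradients partway through.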
Raises:
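The original error output was not preserved here; running the sketch above on the non-differentiable `rounded_mse` loss would produce something like the following (the message wording comes from the illustrative helper, not the PR):

```
ValueError: Loss function is not differentiable: gradients with respect to the predictions are None.
```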
You can see the other usages in this gist.