PR #18105: [Add Feature] - Throw an error if softmax is used with 1 neuron #18264
Imported from GitHub PR #18105
This adds a utility function that checks whether the use of softmax makes sense (a mistake new users make a lot). Applying softmax to a single neuron makes the model output ones every time; there are many Stack Overflow posts about this.
To see this in action, please check the gist.
The same applies to any other layer (Conv2D, etc.) where the axis softmax is applied over (axis=-1 by default) has only one unit.
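The failure mode above can be reproduced without Keras at all. The sketch below (a minimal NumPy softmax, not the PR's actual validation code) shows that normalizing over an axis of size 1 always yields 1.0, regardless of the logit value:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Logits from a 1-unit output layer: shape (batch, 1).
logits = np.array([[2.5], [-1.0], [0.3]])
probs = softmax(logits)
print(probs)  # every entry is 1.0 — the model can only ever predict 1
```

Because `exp(x) / exp(x) == 1` for any single value, the output carries no information, which is why raising an error at model-build time is more helpful than letting training silently fail.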
Copybara import of the project:
--
90c95b1 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Add last layer activation check for softmax
--
1cedb20 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Split logic for sequential and functional models
--
529f968 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Add tests for _check_last_layer_activation
--
d1acddb by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Update sequential check
--
8363016 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Update tests, logic and reformatting
--
ebf16c3 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Update tests and the logic
--
afc156a by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Make validate_softmax_activation experimental
--
3a228fb by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Fix edge case for _validate_softmax_output
--
e9c950e by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Check the softmax axis and raise an error if relevant
--
6355b23 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Update softmax check tests
--
a6745ee by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Minor typo fix
--
92281f6 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Fix test fails for _check_output_activation_softmax
--
72a035f by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Resolve conflict
--
0af059c by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Squashed commit master (merge) to resolve conflicts
--
065cdea by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Revert "Squashed commit master (merge) to resolve conflicts"
This reverts commit 0af059c.
--
446f1dd by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Remove steps_per_execution_tuning from imports
--
1fbd931 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Fix lint
--
2c867f8 by Kaan Bıçakcı 46622558+Frightera@users.noreply.github.com:
Update TestCheckLastLayerActivation tests
Merging this change closes #18105
FUTURE_COPYBARA_INTEGRATE_REVIEW=#18105 from Frightera:last_layer_softmax_warn 2c867f8