
Add mixed precision support #46

Merged: julienperichon merged 6 commits into master from add-mixed-precision-support on Jul 29, 2022

Conversation

julienperichon
Collaborator

Context

A Sicara project using Keras FSL needed support for mixed precision, which until now was not possible for some classes.

Changes

  • Updated the ClippedBinaryCrossentropy loss to cast its float parameters to the loss dtype when computations are done in a dtype other than float32 (see the first sketch below)
  • Updated LearntNorms to keep float32 precision in the last activation layer when mixed precision is used (see the second sketch below)
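The parameter casting described in the first bullet can be illustrated with a minimal sketch. The class name ClippedBinaryCrossentropy comes from this PR, but the parameter names (lower, upper) and the implementation details are assumptions for illustration, not the repository's actual code:

```python
import tensorflow as tf


class ClippedBinaryCrossentropy(tf.keras.losses.Loss):
    """Binary cross-entropy with predictions clipped to [lower, upper].

    The clip bounds are kept as Python floats and cast to the dtype of
    y_pred at call time, so the loss also works when predictions arrive
    in float16 under a mixed-precision policy.
    """

    def __init__(self, lower=0.05, upper=0.95, **kwargs):
        super().__init__(**kwargs)
        self.lower = lower
        self.upper = upper

    def call(self, y_true, y_pred):
        y_pred = tf.convert_to_tensor(y_pred)
        y_true = tf.cast(y_true, y_pred.dtype)
        # Cast the float parameters to the computation dtype so clipping
        # does not mix float32 constants with float16 predictions.
        lower = tf.cast(self.lower, y_pred.dtype)
        upper = tf.cast(self.upper, y_pred.dtype)
        y_pred = tf.clip_by_value(y_pred, lower, upper)
        return tf.keras.losses.binary_crossentropy(y_true, y_pred)
```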

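The LearntNorms change follows the standard Keras recommendation of keeping the final activation in float32 when a mixed-precision policy is active. A minimal sketch of that pattern (the toy model below is illustrative, not the actual LearntNorms head):

```python
import tensorflow as tf
from tensorflow.keras import layers, mixed_precision

# Compute in float16 while keeping float32 variables.
mixed_precision.set_global_policy("mixed_float16")

inputs = tf.keras.Input(shape=(64,))
x = layers.Dense(32, activation="relu")(inputs)  # runs under the float16 policy
logits = layers.Dense(1)(x)
# Force the last activation back to float32: sigmoid/softmax outputs in
# float16 are less numerically stable, so the output layer overrides the policy.
outputs = layers.Activation("sigmoid", dtype="float32")(logits)
model = tf.keras.Model(inputs, outputs)
```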
To Do after this

  • Update the whole repository for mixed precision support

@codecov

codecov bot commented Jul 29, 2022

Codecov Report

Merging #46 (4c67235) into master (149dd31) will increase coverage by 0.65%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master      #46      +/-   ##
==========================================
+ Coverage   65.93%   66.59%   +0.65%     
==========================================
  Files          44       44              
  Lines         963      982      +19     
==========================================
+ Hits          635      654      +19     
  Misses        328      328              
Impacted Files                                          Coverage Δ
keras_fsl/losses/gram_matrix_losses.py                 100.00% <100.00%> (ø)
keras_fsl/losses/tests/gram_matrix_losses_test.py      100.00% <100.00%> (ø)
keras_fsl/models/head_models/learnt_norms.py           100.00% <100.00%> (ø)
..._fsl/models/head_models/tests/learnt_norms_test.py   96.15% <100.00%> (+1.41%) ⬆️

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 149dd31...4c67235.

@julienperichon force-pushed the add-mixed-precision-support branch 5 times, most recently from a2856b4 to 4643d2a on July 29, 2022 at 14:32
@julienperichon force-pushed the add-mixed-precision-support branch from 4643d2a to 79ec9c7 on July 29, 2022 at 14:56
@lmontier
Collaborator

Can you add the link to the branch where the mixed precision computation worked?

@julienperichon merged commit 0f176f8 into master on Jul 29, 2022
@julienperichon deleted the add-mixed-precision-support branch on July 29, 2022 at 15:13