
[chore] Update compatibility to a recent Triton #483

Merged: 4 commits merged into facebookresearch:main from triton_update on Nov 14, 2022

Conversation

@blefaudeux (Contributor) commented Oct 15, 2022

What does this PR do?

  • Moves all the Triton layers to a more recent release of OpenAI Triton. Not all of them still make sense, given nvfuser and PyTorch performance improvements, and I would be fine with dropping some of them, but at the very least this PR makes installing xformers alongside recent Triton code easier.
  • Updates all the plots to reflect a newer PyTorch and a newer Triton.
  • Adds the StarReLU activation (from the Metaformer baselines), partly to check extensibility.
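For context, StarReLU from the Metaformer baselines paper computes scale * relu(x)**2 + bias, where scale and bias are learnable and initialized to constants chosen for roughly unit-variance outputs. A minimal pure-Python sketch of the formula (not the actual xformers module; the default constants below are the paper's suggested initial values):

```python
def star_relu(x, scale=0.8944, bias=-0.4472):
    """StarReLU (Metaformer baselines): scale * relu(x)**2 + bias.

    In a trainable layer, scale and bias would be learnable
    parameters; the defaults here are only initial values.
    """
    r = x if x > 0.0 else 0.0  # ReLU
    return scale * r * r + bias
```

Registering an activation like this one is exactly the kind of extensibility exercise the PR mentions.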

Before submitting

  • Did you have fun?
    • Make sure you had fun coding 🙃
  • Did you read the contributor guideline?
  • Was this discussed/approved via a Github issue? (no need for typos, doc improvements)
    • N/A
  • Did you make sure to update the docs?
    • N/A
  • Did you write any new necessary tests?
    • N/A
  • Did you update the changelog? (if needed)
    • N/A

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

@facebook-github-bot facebook-github-bot added the CLA Signed This label is managed by the Facebook bot. Authors need to sign the CLA before a PR can be reviewed. label Oct 15, 2022
@blefaudeux blefaudeux marked this pull request as draft October 15, 2022 14:06
@blefaudeux blefaudeux mentioned this pull request Oct 15, 2022
18 tasks
@blefaudeux blefaudeux force-pushed the triton_update branch 3 times, most recently from 48c8d23 to 17722fb Compare October 24, 2022 20:27
@danthe3rd
Contributor

Oh that's great! Thanks for the contribution! I gave it a try in #521 by simply changing the requirements.txt file, but it looks like there is more to it

adding a basic triton random check
switching all the asserts in triton fused linear layer to triton's
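As an aside, a "basic triton random check" of this kind typically just verifies that RNG-driven dropout keeps roughly the expected fraction of elements. A hypothetical pure-Python illustration of the idea (not the actual xformers test; every name here is made up):

```python
import random

def observed_keep_rate(n, p, seed=0):
    """Drop each of n elements with probability p and return the
    fraction kept, mimicking what per-element RNG dropout does."""
    rng = random.Random(seed)
    kept = sum(1 for _ in range(n) if rng.random() >= p)
    return kept / n

# Sanity check: the empirical keep rate should sit close to 1 - p.
rate = observed_keep_rate(100_000, p=0.3, seed=42)
assert abs(rate - 0.7) < 0.01
```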
@blefaudeux blefaudeux force-pushed the triton_update branch 2 times, most recently from f6d08e3 to f4cb19f Compare November 12, 2022 21:59
@blefaudeux
Contributor Author

Oh that's great! Thanks for the contribution! I gave it a try in #521 by simply changing the requirements.txt file, but it looks like there is more to it

Yes, some of the interfaces changed (you cannot compose kernels anymore, for instance). It should work with more recent versions too, I believe, since this is post the breaking changes.

@codecov-commenter

codecov-commenter commented Nov 12, 2022

Codecov Report

Base: 88.05% // Head: 88.02% // Decreases project coverage by 0.03% ⚠️

Coverage data is based on head (6b89dac) compared to base (8367685).
Patch coverage: 95.45% of modified lines in pull request are covered.

Additional details and impacted files
@@            Coverage Diff             @@
##             main     #483      +/-   ##
==========================================
- Coverage   88.05%   88.02%   -0.04%     
==========================================
  Files          80       80              
  Lines        4823     4817       -6     
==========================================
- Hits         4247     4240       -7     
- Misses        576      577       +1     
Flag   | Coverage Δ
Python | 88.02% <95.45%> (-0.04%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files                        | Coverage Δ
xformers/triton/dropout.py            | 92.64% <90.90%> (-2.02%) ⬇️
xformers/components/activations.py    | 100.00% <100.00%> (ø)
xformers/triton/fused_linear_layer.py | 100.00% <100.00%> (ø)



@danthe3rd danthe3rd left a comment


Thanks a lot! Happy to review/merge once you think it's ready :)

(Resolved review comment on xformers/triton/k_dropout.py, now outdated.)
@blefaudeux blefaudeux marked this pull request as ready for review November 13, 2022 21:26
@blefaudeux
Contributor Author

Ouch, the latest update doesn't work on the T4 used for CI, which is too old. I'll fix that later when I get the time :(

@blefaudeux blefaudeux marked this pull request as draft November 14, 2022 08:28
@danthe3rd
Contributor

Ouch, the latest update doesn't work on the T4 used for CI, which is too old. I'll fix that later when I get the time :(

Do you think we can merge it without the latest changes? This is currently blocking users from installing xformers with Triton at all.

@blefaudeux
Contributor Author

Ouch, the latest update doesn't work on the T4 used for CI, which is too old. I'll fix that later when I get the time :(

Do you think we can merge it without the latest changes? This is currently blocking users from installing xformers with Triton at all.

It's just a dimension preset on the fused linear kernel; I'll update that.

@blefaudeux blefaudeux marked this pull request as ready for review November 14, 2022 19:59
@blefaudeux blefaudeux changed the title [DRAFT] Update compatibility to a recent Triton [chore] Update compatibility to a recent Triton Nov 14, 2022

@danthe3rd danthe3rd left a comment


Thanks a lot for maintaining the triton part of the code and updating it!
Really appreciated :)

@danthe3rd danthe3rd merged commit d647bb5 into facebookresearch:main Nov 14, 2022
bertmaher pushed a commit to bertmaher/xformers that referenced this pull request Dec 20, 2024