Adding Flash Attention 2 Support for GPT2 #29226
Merged
amyeroberts merged 33 commits into huggingface:main from EduardoPach:add-flash-attn-gpt2 on Mar 28, 2024
+377 −25
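If the integration follows the same pattern as other Flash Attention 2-enabled models in transformers, GPT-2 should be loadable with the standard `attn_implementation="flash_attention_2"` flag once this PR is merged. A minimal usage sketch, assuming the `flash-attn` 2.x package is installed and a CUDA device with fp16/bf16 support is available:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

# Flash Attention 2 requires half-precision weights (fp16 or bf16) and a CUDA GPU.
model = AutoModelForCausalLM.from_pretrained(
    "gpt2",
    torch_dtype=torch.float16,
    attn_implementation="flash_attention_2",
).to("cuda")

inputs = tokenizer("Flash Attention 2 speeds up attention for", return_tensors="pt").to("cuda")
outputs = model.generate(
    **inputs,
    max_new_tokens=32,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```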
Commits
Commits on Feb 22, 2024
Commits on Feb 23, 2024
Commits on Feb 24, 2024
Commits on Mar 1, 2024
Commits on Mar 4, 2024
Commits on Mar 13, 2024
Commits on Mar 14, 2024
Commits on Mar 15, 2024
- Merge branch 'add-flash-attn-gpt2' of https://github.com/EduardoPach/transformers into add-flash-attn-gpt2