ValueError: The current architecture does not support Flash Attention 2.0. Please open an issue on GitHub to request support for this architecture: https://github.com/huggingface/transformers/issues/new
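For context, this error is typically raised when Flash Attention 2 is requested for a model whose architecture has no FA2 integration in `transformers`. Below is a minimal sketch of the pattern, assuming a `transformers` release that accepts the `attn_implementation` kwarg (older releases used `use_flash_attention_2=True` instead); the model id is a hypothetical placeholder.

```python
import torch
from transformers import AutoModelForCausalLM

model_id = "some-org/some-model"  # hypothetical placeholder checkpoint

try:
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,                # FA2 requires fp16/bf16
        attn_implementation="flash_attention_2",  # request Flash Attention 2
    )
except ValueError:
    # Architectures without FA2 support raise the ValueError above;
    # fall back to the default attention implementation.
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,
    )
```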
cc @younesbelkada :)
I believe this is actually a duplicate of #26443, closing in favor of this one
Hi @nirdoshrawal009, please take a look at my comment here for more details: #26443 (comment)