ImportError: cannot import name 'is_flash_attn_available' from 'transformers.utils' (~/lib/python3.10/site-packages/transformers/utils/__init__.py) #27319
Comments
Hi @Rajmehta123! Thanks for reporting this issue. This happens because in the latest version of transformers, we removed `is_flash_attn_available` from `transformers.utils`.
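One generic way to stay compatible when a helper is removed or renamed between releases is a fallback attribute lookup. This is a minimal sketch; `resolve_helper` and the candidate names are hypothetical illustrations, not part of the transformers API:

```python
import importlib


def resolve_helper(module_name, candidates):
    """Return the first attribute found among the candidate names, or None.

    Useful when a library helper is renamed or removed between releases:
    list the new name first, then older fallbacks.
    """
    mod = importlib.import_module(module_name)
    for name in candidates:
        fn = getattr(mod, name, None)
        if fn is not None:
            return fn
    return None


# Demonstration with the standard library (no transformers install needed):
# the first candidate does not exist, so the lookup falls through to math.sqrt.
sqrt = resolve_helper("math", ["no_such_name", "sqrt"])
```

Remote code that hard-imports a helper by its old name would still need to be updated upstream; this pattern only helps code you control.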
Hi @SunMarc, I have an error as well, but on the flip side:

from transformers import pipeline
RuntimeError: Failed to import transformers.models.bart.modeling_bart because of the following error (look up to see its traceback)

Transformers version: 4.35.0
Hi @regineshalom, I'm unable to reproduce your error on my local setup. If you can reproduce the error in a Colab and send it to me, that would be great.
I got the same error.
I think upgrading would be better 😉 |
@curlup It works fine for me.
System Info
transformers: 4.35.0
python: 3.10.13
Platform: Linux
Who can help?
@Narsil @ArthurZucker @SunMarc
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
import torch
model = AutoModelForCausalLM.from_pretrained("01-ai/Yi-6B-200K", trust_remote_code=True,torch_dtype=torch.bfloat16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("01-ai/Yi-6B-200K",trust_remote_code=True)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
This works fine with transformers 4.34.1.
Expected behavior
The model gets loaded correctly
But I got this error:
ImportError: cannot import name 'is_flash_attn_available' from 'transformers.utils' (~/lib/python3.10/site-packages/transformers/utils/__init__.py)
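Since the snippet works on 4.34.1 and breaks on 4.35.0, one option is to guard on the installed version before importing version-sensitive helpers. A minimal sketch, assuming (based only on this report) that the helper disappeared starting in 4.35.0:

```python
# Stdlib only (Python 3.8+): read an installed package's version without
# importing the package itself.
from importlib.metadata import PackageNotFoundError, version


def parse_version(v):
    """Turn a version string like '4.35.0' into (4, 35, 0) for comparison."""
    return tuple(int(p) for p in v.split(".")[:3])


# Assumption from this report: the helper is gone starting with 4.35.0.
REMOVED_IN = (4, 35, 0)


def helper_was_removed(pkg="transformers"):
    """True if the installed version is at or past the removal boundary."""
    try:
        return parse_version(version(pkg)) >= REMOVED_IN
    except PackageNotFoundError:
        return False
```

Alternatively, pinning `transformers==4.34.1` until the model's remote code is updated sidesteps the check entirely.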
This works fine with transformers 4.34.1.