
ImportError: cannot import name 'is_flash_attn_available' from 'transformers.utils' (~/lib/python3.10/site-packages/transformers/utils/__init__.py) #27319

Closed
2 of 4 tasks
Rajmehta123 opened this issue Nov 6, 2023 · 7 comments · Fixed by #27330

Comments

@Rajmehta123

Rajmehta123 commented Nov 6, 2023

System Info

transformers: 4.35.0
python: 3.10.13
Platform: Linux

Who can help?

@Narsil @ArthurZucker @SunMarc

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
import torch

# Loading the Yi-6B-200K remote code is what triggers the import error on 4.35.0
model = AutoModelForCausalLM.from_pretrained("01-ai/Yi-6B-200K", trust_remote_code=True, torch_dtype=torch.bfloat16, device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("01-ai/Yi-6B-200K", trust_remote_code=True)
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)

This works fine with transformers 4.34.1.

Expected behavior

The model should load correctly.

Instead, I get this error:

ImportError: cannot import name 'is_flash_attn_available' from 'transformers.utils' (~/lib/python3.10/site-packages/transformers/utils/__init__.py)


@SunMarc
Member

SunMarc commented Nov 6, 2023

Hi @Rajmehta123! Thanks for reporting this issue. This happens because in the latest version of transformers, we removed is_flash_attn_available() in favor of is_flash_attn_2_available(). See the related PR. I guess the remote code you are executing still uses is_flash_attn_available, hence the error. I've opened a PR to put the function back and take it through a deprecation cycle!
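In the meantime, remote modeling code that has to run on both 4.34.x and 4.35.0 could guard the import itself. A minimal sketch (the fallback alias is an illustration, not the fix from the linked PR):

# Prefer the new helper; fall back to the old name on versions that still ship it.
try:
    from transformers.utils import is_flash_attn_2_available
except ImportError:
    from transformers.utils import is_flash_attn_available as is_flash_attn_2_available

use_flash_attention = is_flash_attn_2_available()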

@regineshalom

regineshalom commented Nov 14, 2023

Hi @SunMarc, I have an error as well, but it's on the flip side:

from transformers import pipeline
classifier = pipeline('summarization')

RuntimeError: Failed to import transformers.models.bart.modeling_bart because of the following error (look up to see its traceback):
cannot import name 'is_flash_attn_2_available' from 'transformers.utils'

transformers: 4.35.0
Python: 3.11.4

@SunMarc
Member

SunMarc commented Nov 14, 2023

Hi @regineshalom, I'm unable to reproduce your error on my local setup. If you can reproduce it in a Colab and send it to me, that would be great.
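Since 4.35.0 does ship is_flash_attn_2_available (it replaced the old helper there), an error like the one above often points at a stale or shadowed install. A generic way to double-check which copy Python actually imports (a diagnostic sketch, not something posted in this thread):

import transformers

# Confirm the version and install location that Python resolves at import time
print(transformers.__version__)
print(transformers.__file__)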

@yecphaha

yecphaha commented Jan 25, 2024

ImportError: cannot import name 'is_flash_attn_available' from 'transformers.utils'
Answer: pip install transformers==4.34.1. The transformers version must be 4.34; 4.31, 4.32, and 4.33 do not work.


@curlup
Contributor

curlup commented Feb 22, 2024

I got the same error: cannot import name 'is_flash_attn_2_available' from 'transformers.utils'.
What version should I downgrade to to get it working? Thanks.

@ArthurZucker
Collaborator

I think upgrading would be better 😉
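For reference, the plain pip upgrade looks like this (standard pip usage, not a command quoted from the thread):

# Upgrade transformers to the latest released version
pip install --upgrade transformers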

@imneov

imneov commented Apr 12, 2024

I got the same error: cannot import name 'is_flash_attn_2_available' from 'transformers.utils'. What version should I downgrade to to get it working? Thanks.

# Install the specific version using pip
pip install transformers==4.34.1

@curlup It works fine for me.
