flash_attn 2.5.9.post1 fails with ImportError on Python 3.10 #1027

Open
Laz4rz opened this issue Jul 4, 2024 · 0 comments

Laz4rz commented Jul 4, 2024

Trying to import flash_attn with:

python: 3.10.14 (main, May  6 2024, 19:42:50) [GCC 11.2.0]
transformers: 4.42.3
torch: 2.3.1+cu121
flash_attn: 2.5.9.post1

Fails with ImportError:

ImportError: /home/mikolaj/miniconda3/envs/gemma/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6DeviceEEENS6_IbEE
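
For reference, a minimal reproduction sketch: nothing beyond the bare import is needed to trigger the error, and the version prints are only there to confirm the environment above.

```python
# Minimal reproduction sketch: the failure happens at import time of the
# compiled extension, before any flash-attn API is actually called.
import sys

import torch

print(sys.version)        # expected: 3.10.14 ...
print(torch.__version__)  # expected: 2.3.1+cu121

# On the environment above this import raises the undefined-symbol
# ImportError, because flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so
# cannot resolve an at::_ops::zeros symbol against the installed torch build.
import flash_attn

print(flash_attn.__version__)  # expected: 2.5.9.post1
```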

The current workaround is to switch to Python 3.11 or 3.12.
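
Until then, a guard like the one below can turn the opaque undefined-symbol error into an actionable message at startup. This is only a sketch based on this report (the affected combination being flash-attn 2.5.9.post1 on CPython 3.10), not an official check from the library.

```python
import sys
from importlib.metadata import PackageNotFoundError, version


def check_flash_attn_compat() -> None:
    """Fail fast with a readable message instead of an undefined-symbol error."""
    try:
        fa_version = version("flash_attn")
    except PackageNotFoundError:
        return  # flash-attn not installed, nothing to check
    if fa_version == "2.5.9.post1" and sys.version_info[:2] == (3, 10):
        raise RuntimeError(
            "flash-attn 2.5.9.post1 is known to fail at import on Python 3.10 "
            "(undefined torch symbol, see Dao-AILab/flash-attention#1027); "
            "switch to Python 3.11/3.12 or install a different flash-attn release."
        )


check_flash_attn_compat()
import flash_attn  # noqa: E402  # import only after the check has passed
```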

YumaTsuta added a commit to llm-jp/modelwg that referenced this issue Jul 9, 2024
- Added 'setuptools<70' to requirements to ensure compatibility with packaging library (see: aws-neuron/aws-neuron-sdk#893)
- Added version restriction "flash-attn!=2.5.9.post1" to resolve installation issues with Python 3.10 (see: Dao-AILab/flash-attention#1027)
- Removed obsolete file copy commands for non-existent files
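
A small sketch of verifying those two pins against an installed environment, assuming the packaging library is available (it is used here only for specifier parsing; the constraint strings are taken from the commit message above):

```python
# Sketch: check that installed versions satisfy the pins from the commit
# message above ("setuptools<70", "flash-attn!=2.5.9.post1").
from importlib.metadata import PackageNotFoundError, version
from packaging.specifiers import SpecifierSet

CONSTRAINTS = {
    "setuptools": SpecifierSet("<70"),
    "flash_attn": SpecifierSet("!=2.5.9.post1"),
}

for name, spec in CONSTRAINTS.items():
    try:
        installed = version(name)
    except PackageNotFoundError:
        print(f"{name}: not installed")
        continue
    status = "ok" if installed in spec else f"violates {spec}"
    print(f"{name} {installed}: {status}")
```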