Trying to import flash_attn with:
python: 3.10.14 (main, May 6 2024, 19:42:50) [GCC 11.2.0]
transformers: 4.42.3
torch: 2.3.1+cu121
flash_attn: 2.5.9.post1
Fails with ImportError:
ImportError: /home/mikolaj/miniconda3/envs/gemma/lib/python3.10/site-packages/flash_attn_2_cuda.cpython-310-x86_64-linux-gnu.so: undefined symbol: _ZN2at4_ops5zeros4callEN3c108ArrayRefINS2_6SymIntEEENS2_8optionalINS2_10ScalarTypeEEENS6_INS2_6LayoutEEENS6_INS2_6DeviceEEENS6_IbEE
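The undefined symbol demangles to at::_ops::zeros::call, a torch ABI symbol, which suggests the flash-attn extension was built against a different torch than the one installed. A minimal diagnostic sketch like the one below (the print labels are mine, not from the report) can confirm the failing combination before trying a workaround:

```python
# Diagnostic sketch (not part of flash-attn): print the interpreter, torch,
# and flash-attn versions, then trigger the failing import explicitly.
import sys
import importlib.metadata as md

import torch

print("python     :", sys.version.split()[0])
print("torch      :", torch.__version__)
print("flash-attn :", md.version("flash-attn"))

# The undefined-symbol error is raised while loading the compiled CUDA
# extension, so this plain import is enough to reproduce it.
import flash_attn
print("flash_attn imported OK:", flash_attn.__version__)
```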
The current workaround is to switch Python to 3.11 or 3.12.
fix: Resolve pip install errors by adjusting dependencies and cleanup
b9ea286
- Added 'setuptools<70' to requirements to ensure compatibility with the packaging library (see: aws-neuron/aws-neuron-sdk#893)
- Added version restriction "flash-attn!=2.5.9.post1" to resolve installation issues with Python 3.10 (see: Dao-AILab/flash-attention#1027)
- Removed obsolete file copy commands for non-existent files
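For anyone applying the same workaround via pip, the two pins from that commit would sit in a requirements file roughly like this (excerpt only; the surrounding entries are assumed):

```
# requirements.txt excerpt with the pins described above (other entries omitted)
setuptools<70            # packaging compatibility (aws-neuron/aws-neuron-sdk#893)
flash-attn!=2.5.9.post1  # avoids the undefined-symbol ImportError on Python 3.10
```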