ImportError using Llama 2 with BetterTransformer #1481
Comments
cc @fxmarty this function is private and was removed in huggingface/transformers#26792!
@ArthurZucker thanks for the reply (and for all your and the HF team's work on the code!). Does this mean the solution is to reinstall from git? (I'm not sure how long PRs take to percolate into the version pip installs from git, or if it is instant.)
Worst case we can also leave and deprecate it in
We'll deprecate it in Transformers: huggingface/transformers#27074 (comment)
It is private, I'll fix.
For convenience's sake, I deprecated the functions now in huggingface/transformers#27074 (comment)
Please use transformers>=4.36 with torch>=2.1.1 directly to benefit from PyTorch SDPA optimizations by default. BetterTransformer for Llama is deprecated: https://huggingface.co/docs/transformers/perf_infer_gpu_one#flashattention-and-memory-efficient-attention-through-pytorchs-scaleddotproductattention
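For reference, a minimal sketch of loading a Llama checkpoint with the native SDPA attention in recent Transformers, assuming transformers>=4.36 and torch>=2.1.1 are installed; the model ID is a placeholder for whatever checkpoint you are fine-tuning:

```python
# Minimal sketch: with transformers>=4.36 and torch>=2.1.1, SDPA attention is
# used by default for Llama, so BetterTransformer is no longer needed.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-7b-hf"  # placeholder; substitute your checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    attn_implementation="sdpa",  # explicit here; this is already the default on recent versions
)
```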
System Info
I'm attempting to fine-tune a Llama2 model. My training loop works fine without BetterTransformers, but when I attempt to import the module it looks like there is an ImportError:
ImportError: cannot import name '_expand_mask' from 'transformers.models.llama.modeling_llama'
I already installed Transformers from git as described in this issue.
Version info:
transformers @ git+https://github.com/huggingface/transformers@32f799db0d625ec5cf82624ff2604c5a891ebf61
optimum==1.13.2
Python 3.8.16
Full stack trace is below. Any suggestions? Thanks!
This is probably a question for @ArthurZucker based on his answer to the post linked above!
Who can help?
@ArthurZucker
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
Installing the versions above and importing BetterTransformer should reproduce the issue, but I haven't tried it.
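A minimal reproduction sketch, assuming the versions listed above; the model ID is a placeholder, and the ImportError is expected to surface on the optimum import (or when BetterTransformer.transform is called):

```python
# Minimal reproduction sketch with the versions listed in the issue.
from transformers import AutoModelForCausalLM
from optimum.bettertransformer import BetterTransformer  # ImportError expected here

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")  # placeholder checkpoint
model = BetterTransformer.transform(model)  # optimum's BetterTransformer entry point
```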
Expected behavior
I expect to import BetterTransformer without errors.