SageMaker fails because Conversation object is not found #129
Comments
I have the same problem when running SageMaker specifying transformers 4.44.

I tried specifying transformers

Specifying
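For reference, this is roughly how a transformers version is selected when deploying through the SageMaker Python SDK. The sketch below uses placeholder values (bucket, role, instance type) and a version combination matching an existing Hugging Face DLC; none of it is confirmed by this thread.

```python
# Minimal deployment sketch with illustrative values (untested):
# transformers_version / pytorch_version / py_version select the Hugging Face
# DLC image, which bundles the sagemaker-huggingface-inference-toolkit.
from sagemaker.huggingface import HuggingFaceModel

model = HuggingFaceModel(
    model_data="s3://my-bucket/model.tar.gz",   # placeholder S3 path
    role="my-sagemaker-execution-role",         # placeholder IAM role
    transformers_version="4.37",                # matches an existing DLC
    pytorch_version="2.1",
    py_version="py310",
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.xlarge")
```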
I have also encountered this issue when trying to update to the most recent version of transformers while continuing to use the sagemaker-huggingface-inference-toolkit. Is the simple fix for now to remove the `Conversation` import? I'm more concerned that this highlights a lack of maintenance and/or wider usage of this package across the community. What is the recommended way to host HF models for inference in SageMaker that does not use the sagemaker-huggingface-inference-toolkit?
@ed-berry I'm still having this issue if I use the latest version of transformers (4.44.2), and I get other errors if I use transformers 4.41.2. I see your PR, but it still hasn't been merged. @philschmid has approved it, but it currently says "This workflow requires approval from a maintainer." It has been a week since the last action. Do you know how we can bring this to the attention of the maintainers?
Hey @joann-alvarez. That PR will need to be merged and a new release of the inference toolkit created to resolve the issue if you aren't able to use an older version of transformers. Yeah, one of the maintainers needs to approve a run of the GitHub workflows that check the PR before it can be merged. I'm not sure how we flag that beyond creating a PR and this issue, though. Your best bet might be trying to get an older version of transformers working for now.
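One concrete way to follow that advice is to pin the older transformers in the `requirements.txt` shipped in the `code/` directory of the model archive, which the Hugging Face inference container installs at startup (assuming the standard `code/inference.py` + `code/requirements.txt` layout). A minimal sketch; the exact pin shown is the one discussed later in this thread:

```
# code/requirements.txt inside model.tar.gz -- the pin is illustrative
transformers==4.41.2
```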
Hi, how do I use the latest version? I'm trying to deploy FLUX to SageMaker, but I'm facing this issue when installing
Is there any update or temporary solution to this? I can't pin it to a specific lower version, as SageMaker would complain about it.
Can you try temporarily installing/adding
@philschmid I just tried adding that. I still get the error I've been getting since August:

It also seems that
Pinning transformers==4.41.2 is suboptimal in our case. Setting huggingface-hub==0.25.2 also didn't help. A fix would be great!
I took these steps, and I think one of them ended up fixing it: get the latest version of

I'm not sure which of these fixed it.
Hi @joann-alvarez, which image did you use?
`763104351884.dkr.ecr.us-west-1.amazonaws.com/huggingface-pytorch-inference:2.1.0-transformers4.37.0-gpu-py310-cu118-ubuntu20.04`

EDIT: It was the "west" version, not the east. I changed the location above.
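If a specific container turns out to be what fixes it, the image can also be passed explicitly rather than derived from the version arguments. A sketch reusing the image URI quoted above; the account/region in the URI must match the region you deploy in, and the other values are placeholders:

```python
# Sketch: pass the DLC image URI explicitly instead of the version arguments
# (untested; the region/account in the URI must match where you deploy).
from sagemaker.huggingface import HuggingFaceModel

image_uri = (
    "763104351884.dkr.ecr.us-west-1.amazonaws.com/huggingface-pytorch-inference:"
    "2.1.0-transformers4.37.0-gpu-py310-cu118-ubuntu20.04"
)
model = HuggingFaceModel(
    model_data="s3://my-bucket/model.tar.gz",  # placeholder S3 path
    role="my-sagemaker-execution-role",        # placeholder IAM role
    image_uri=image_uri,                       # overrides the version-based lookup
)
predictor = model.deploy(initial_instance_count=1, instance_type="ml.g5.xlarge")
```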
We are hosting a model in SageMaker, and today we observed the following error in our logs when the model was being relaunched on the instance:

I found a similar issue reported at https://discuss.huggingface.co/t/cannot-import-conversation-from-transformers-utils-py/91556. When digging into the changes in the `transformers` dependency, I found this change regarding the `Conversation` object: huggingface/transformers#31165

Based on our logs, the last time the model was successfully relaunched in our SageMaker infrastructure (2 days ago), the downloaded package version was `transformers-4.41.2-py3-none-any.whl`, but when the error started to be observed, the downloaded version was `transformers-4.42.1-py3-none-any.whl`.
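A minimal way to see the breaking change in isolation (this is not the toolkit's exact code path, just an illustration of the import it depends on):

```python
# Illustration of the breaking change: transformers < 4.42 exports Conversation,
# transformers >= 4.42 removed it (huggingface/transformers#31165), so the same
# import starts raising ImportError after the upgrade.
try:
    from transformers import Conversation  # noqa: F401
    print("Conversation is importable (transformers < 4.42)")
except ImportError as err:
    print(f"Conversation is gone (transformers >= 4.42): {err}")
```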
According to the pull request mentioned above, the change is effective as of version `4.42`, which was released 18 hours ago at the time of writing this issue.

I think a change is needed in `src/sagemaker_huggingface_inference_toolkit/transformers_utils.py` to reflect the new structure of the code in the `transformers` dependency, or the dependency should be pinned to version `4.41.2` to prevent the issue in the future.

In our case, we are updating our `requirements.txt` file to pin version `4.41.2`, but other users might not be aware of what is happening and therefore will not know about this fix.
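For completeness, the general shape of a backward-compatible change in `transformers_utils.py` could look like the sketch below; this is illustrative only, not the fix that was eventually proposed or merged:

```python
# Sketch of a version-tolerant import for transformers_utils.py (illustrative,
# not the merged change): keep supporting transformers < 4.42 without crashing
# on >= 4.42, where Conversation no longer exists.
try:
    from transformers import Conversation  # removed in transformers 4.42
except ImportError:
    Conversation = None  # conversational pipeline support is unavailable
```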