System Info

Any OS that can run transformers.

Related issue: vllm-project/vllm#6224
Reproduction

With transformers 4.42.3:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("llava-hf/llava-v1.6-34b-hf")
print(tokenizer.encode("<image>"))
print(tokenizer.vocab_size)
```

Output:

```
[64003]
64000
```

Can confirm transformers 4.40.1 does not have this issue; there the output is:

```
[64000]
64000
```

Expected behavior

Shouldn't the encoded ID of `<image>` stay the same across versions?
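For context, here is a minimal offline sketch of why `encode("<image>")` can legitimately return an ID at or above `vocab_size`: in Hugging Face-style tokenizers, `vocab_size` reports only the base vocabulary, while added special tokens receive IDs appended after it. This `ToyTokenizer` is a hypothetical stand-in for illustration, not the real LLaVA tokenizer or the transformers implementation; the bug reported above is that the added-token ID itself moved (64000 to 64003) between versions.

```python
# Toy model of a tokenizer with a fixed base vocabulary plus added tokens.
# Mirrors (in spirit, not in code) how transformers-style tokenizers assign
# IDs to added special tokens after the base vocab.

class ToyTokenizer:
    def __init__(self, base_vocab):
        self.base_vocab = dict(base_vocab)  # token -> id
        self.added_vocab = {}               # added tokens live "above" the base vocab

    @property
    def vocab_size(self):
        # Counts only the base vocabulary, like transformers' `vocab_size`.
        return len(self.base_vocab)

    def add_special_token(self, token):
        if token not in self.added_vocab:
            self.added_vocab[token] = len(self.base_vocab) + len(self.added_vocab)
        return self.added_vocab[token]

    def encode(self, token):
        if token in self.added_vocab:
            return [self.added_vocab[token]]
        return [self.base_vocab[token]]

base = {f"tok{i}": i for i in range(64000)}
tok = ToyTokenizer(base)
tok.add_special_token("<image>")

print(tok.encode("<image>"))  # [64000]
print(tok.vocab_size)         # 64000
```

Under this scheme the first added token always gets ID `vocab_size`, matching the 4.40.1 output; the 4.42.x output of 64003 suggests the 34B checkpoint's added-token ordering changed.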
After trying this code on a few different versions, it looks like this got changed between 4.41.2 and 4.42.0.
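The bisection above can be sketched as a quick version check. This is a hypothetical helper (assuming plain `major.minor.patch` version strings, not a transformers API), reflecting the state at the time of this report, before any fix landed:

```python
def parse_version(v: str) -> tuple:
    """Split a plain 'major.minor.patch' string into an int tuple."""
    return tuple(int(part) for part in v.split("."))

def is_affected(version: str) -> bool:
    # Per the bisection: the <image> token ID changed between 4.41.2 and
    # 4.42.0, so treat 4.42.0 and later as affected (as of this report).
    return parse_version(version) >= (4, 42, 0)

print(is_affected("4.42.3"))  # True  (reproduces the issue)
print(is_affected("4.41.2"))  # False
print(is_affected("4.40.1"))  # False (confirmed good)
```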
Also, this issue only occurs for llava-hf/llava-v1.6-34b-hf. It works fine for llava-hf/llava-v1.6-mistral-7b-hf and llava-hf/llava-v1.6-vicuna-7b-hf.
Update: I think this is the same issue as #31713.
Answered in #31713 (comment)
Closing because #31902 was merged