
T5 Base length of Tokenizer not equal config vocab_size #10144

Closed

ari9dam opened this issue Feb 11, 2021 · 2 comments
ari9dam commented Feb 11, 2021

Environment info

  • transformers version: Installed from git

Issue

`len(AutoTokenizer.from_pretrained("t5-base"))` is 32100, but `T5ForConditionalGeneration.from_pretrained("t5-base").config.vocab_size` is 32128. This seems similar to #2020.
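
Repro (the printed counts are as reported above; the breakdown of 32100 into 32000 SentencePiece pieces plus 100 `<extra_id_*>` sentinels is my reading of the t5-base tokenizer, not something the config itself states):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# 32000 SentencePiece tokens + 100 <extra_id_*> sentinel tokens
print(len(tokenizer))           # 32100

# The embedding matrix has 28 more rows than the tokenizer can ever produce
print(model.config.vocab_size)  # 32128
```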

@patrickvonplaten
Contributor

Duplicate of #4875, I think.
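
In short, the embedding matrix is intentionally larger than the tokenizer, and the trailing ids are never emitted by the stock tokenizer, so the mismatch is harmless out of the box. If you add your own tokens and want the two sizes to match, `resize_token_embeddings` is the usual transformers call; a minimal sketch (the added token name is hypothetical):

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

# Hypothetical new token; any added tokens push len(tokenizer) past 32100
tokenizer.add_tokens(["<my_new_token>"])

# Resize the embedding matrix to track the tokenizer; this also
# updates model.config.vocab_size
model.resize_token_embeddings(len(tokenizer))
print(model.config.vocab_size)  # now equals len(tokenizer)
```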

@github-actions

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

Please note that issues that do not follow the contributing guidelines are likely to be ignored.
