
Not able to use FP16 in pytorch-pretrained-BERT #139

Closed
Ashish-Gupta03 opened this issue Dec 20, 2018 · 0 comments
I'm not able to use FP16 with the pytorch-pretrained-BERT code. In particular, when I enable fp16 with BertForSequenceClassification I get:

RuntimeError: Expected object of scalar type Half but got scalar type Float for argument #2 'target'

Also, when I cast

logits = logits.half()
labels = labels.half()

the time per epoch increases.

Originally posted by @Ashish-Gupta03 in https://github.com/huggingface/pytorch-pretrained-BERT/issue_comments#issuecomment-449096213
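
For reference, a minimal sketch of the usual fp16 dtype handling with BertForSequenceClassification (assuming pytorch-pretrained-BERT, a CUDA device, and illustrative dummy inputs): only the model weights are cast to half, while the classification labels stay Long tensors for CrossEntropyLoss, so casting labels with .half() should not be needed.

import torch
from pytorch_pretrained_bert import BertTokenizer, BertForSequenceClassification

device = torch.device("cuda")

# Load the classifier and convert its weights to FP16; input ids and labels stay integer tensors.
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)
model.half()
model.to(device)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
tokens = tokenizer.tokenize("a short example sentence")
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)], dtype=torch.long, device=device)
labels = torch.tensor([1], dtype=torch.long, device=device)  # Long targets for CrossEntropyLoss, not .half()

# The forward pass returns the loss when labels are passed.
loss = model(input_ids, labels=labels)
loss.backward()  # full fp16 training typically also needs loss scaling, as in the repo's run_classifier example

This is only a sketch of the dtype handling, not the repository's complete fp16 training recipe.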
