
bf16 & tf32, can they be used together? #252

Open
DSXiangLi opened this issue May 3, 2023 · 4 comments
Comments

@DSXiangLi

In the hyperparameters, I saw that bf16 and tf32 are both set to true. However, I don't think they can be used together?

@hujunchao

I have the same question

@dotsnangles

dotsnangles commented May 7, 2023

I was searching the internet for the answer and then found this thread.
Is it possible, though? I think the latter arg probably overwrites the former.
If someone could clarify this, I would be grateful.

this thread might help: huggingface/transformers#14608
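
For what it's worth, in PyTorch itself these appear to be two independent knobs rather than one overwriting the other: tf32 changes how fp32 matmuls execute on Ampere+ tensor cores, while bf16 controls the autocast dtype. A minimal sketch in plain PyTorch (not this repo's training script; these are just the standard PyTorch APIs):

```python
import torch

# TF32 toggle: affects how *fp32* matmuls/convolutions execute on
# Ampere+ GPUs (tensor cores with a reduced-precision mantissa).
torch.backends.cuda.matmul.allow_tf32 = True
torch.backends.cudnn.allow_tf32 = True

model = torch.nn.Linear(1024, 1024).cuda()
x = torch.randn(8, 1024, device="cuda")

# bf16 autocast: eligible ops run in bfloat16 and never touch the
# TF32 path; any ops autocast keeps in fp32 can still use TF32.
with torch.autocast(device_type="cuda", dtype=torch.bfloat16):
    y = model(x)
```

So the second flag doesn't clobber the first; they apply to different sets of operations.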

@yxchng

yxchng commented Jun 25, 2023

Any new updates on this?

@tdolega

tdolega commented Feb 23, 2024

By setting bf16, the model weights are saved in bf16, but the gradients (computed in half precision, then converted to full precision for the update) and the optimizer states are still kept in full 32-bit precision. By additionally setting tf32, you speed up those full-precision calculations at the expense of some precision, but since the weights are ultimately saved as bf16 anyway, it probably has no effect on accuracy.
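
If the hyperparameters in question are Hugging Face TrainingArguments flags (which the names bf16 and tf32 suggest, though I'm assuming that here), the two can indeed be set together as independent options:

```python
from transformers import TrainingArguments

# Assumes the repo forwards these hyperparameters to HF's Trainer.
# bf16 autocasts the forward/backward passes to bfloat16; tf32 lets
# any matmuls that still run in fp32 use TF32 tensor cores.
args = TrainingArguments(
    output_dir="out",  # hypothetical path, for illustration only
    bf16=True,
    tf32=True,
)
```

Under this reading, bf16 decides which ops run in half precision, and tf32 only changes how the remaining fp32 ops are executed, so the combination is consistent rather than contradictory.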
