Respect torch.use_deterministic_algorithms #7579

Closed
ezyang opened this issue May 12, 2023 · 2 comments · Fixed by #7582
Comments

@ezyang
Contributor

ezyang commented May 12, 2023

🐛 Describe the bug

Some of torchvision's backward CUDA operators (e.g., roi_align_backward) use GPU atomic adds, which makes them nondeterministic. It would be great if they were coded to respect torch.use_deterministic_algorithms. The easy thing to do is to just make them warn, but if we can implement a Python version of the function, we can also follow the same playbook as in pytorch/pytorch#101115.
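For concreteness, here is a minimal sketch of the behavior being asked for (shapes and box values are arbitrary, chosen just for illustration). Today the backward silently goes through the atomicAdd path; once the op respects the flag, deterministic mode should raise instead:

```python
import torch
from torchvision.ops import roi_align

# Request deterministic algorithms globally. Ops without a
# deterministic implementation are expected to raise.
torch.use_deterministic_algorithms(True)

x = torch.randn(1, 3, 16, 16, device="cuda", requires_grad=True)
# boxes rows are (batch_index, x1, y1, x2, y2)
boxes = torch.tensor([[0.0, 1.0, 1.0, 10.0, 10.0]], device="cuda")

out = roi_align(x, boxes, output_size=(4, 4))
# Desired behavior: RuntimeError naming roi_align_backward,
# rather than silently nondeterministic gradients.
out.sum().backward()
```

With torch.use_deterministic_algorithms(True, warn_only=True), the same call should emit a warning instead of raising, matching how nondeterministic ops in core PyTorch behave.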

cc @kurtamohler

Versions

master

@NicolasHug
Member

Could be a way of addressing #7457

The non-determinism has been around forever; @ezyang, is there a time-critical need to fix it now?

@ezyang
Contributor Author

ezyang commented May 12, 2023

I'll submit the patch. I do kind of need it.
