
[PyTorch] Fix rsub type #10090

Merged
merged 2 commits on Jan 28, 2022
Conversation

comaniac
Contributor

A quick patch to fix rsub conversion in PyTorch. The original implementation used `float(inputs[2])` for alpha, which implies that both data0 and data1 must be float32. As a result, I got a type error when converting an FP16 model.

cc @t-vi @masahi
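The failure mode can be sketched outside TVM with a small NumPy example (a minimal sketch, not the converter's actual code; the `rsub` helper below is hypothetical). `rsub(data0, data1, alpha)` computes `data1 - alpha * data0`; hardcoding alpha as a float32 scalar promotes FP16 operands, while casting alpha to the operand dtype keeps the result in FP16:

```python
import numpy as np

def rsub(data0, data1, alpha=1):
    # rsub computes data1 - alpha * data0.
    # Buggy pattern: alpha = np.float32(alpha) forces a float32
    # intermediate, promoting FP16 inputs and producing a dtype
    # mismatch downstream.
    # Fixed pattern: cast alpha to the operand dtype instead.
    alpha = np.asarray(alpha, dtype=data0.dtype)
    return data1 - alpha * data0

a = np.ones(4, dtype=np.float16)
b = np.full(4, 3.0, dtype=np.float16)
out = rsub(a, b, alpha=2)
# out keeps the float16 dtype because alpha was cast to match it
```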

@masahi masahi merged commit e6af874 into apache:main Jan 28, 2022
@comaniac comaniac deleted the fix_pt_rsub branch January 28, 2022 17:54
sunggg pushed a commit to sunggg/tvm that referenced this pull request Jan 29, 2022
* [PyTorch] Fix rsub type

* fix
ylc pushed a commit to ylc/tvm that referenced this pull request Feb 16, 2022
* [PyTorch] Fix rsub type

* fix