
Support default padding case for tosa::AvgPool in the presence of count_include_pad #3868

Merged
merged 1 commit into llvm:main on Nov 12, 2024

Conversation

Hanumanth04
Contributor

@Hanumanth04 Hanumanth04 commented Nov 12, 2024

Essentially, as part of my earlier change, I didn't consider the padding value when erroring out for unsupported count_include_pad during torch-to-tosa lowering of AvgPool2d. The fix captured in this change addresses that. Please see the linked issue for more details.
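The idea behind the fix can be sketched as follows: with zero padding (PyTorch's default), count_include_pad has no observable effect on the averaging divisor, so the lowering only needs to be rejected when count_include_pad is set *and* the padding is non-zero. This is a minimal illustrative sketch of that guard, not the actual torch-mlir C++ code; the function name and signature are hypothetical.

```python
def can_lower_avg_pool2d(count_include_pad: bool, padding: tuple) -> bool:
    """Hypothetical guard mirroring the fix.

    With all-zero padding there are no padded elements to include or
    exclude from the divisor, so count_include_pad is a no-op and the
    lowering can proceed either way.
    """
    if count_include_pad and any(p != 0 for p in padding):
        # Only this combination is genuinely unsupported.
        return False
    return True
```

Before the fix, the check effectively rejected on count_include_pad alone, which spuriously blocked the common default-padding case.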

@Hanumanth04 Hanumanth04 changed the title Change to support default padding case for tosa::AvgPool in the prese… Support default padding case for tosa::AvgPool in the presence of count_include_pad Nov 12, 2024
@Hanumanth04
Contributor Author

Hi @sjarus, @dbabokin, @sahas3, @justin-ngo-arm, @rafaelubalmw, could you please take a look at this change?

@justin-ngo-arm
Copy link
Contributor

This looks reasonable to me. @sjarus, do you have any further comments?

Contributor

@dbabokin dbabokin left a comment


Looks good to me!

@sjarus
Collaborator

sjarus commented Nov 12, 2024

Thanks a lot, @Hanumanth04! I'll go ahead and merge.

@sjarus sjarus merged commit 30c5193 into llvm:main Nov 12, 2024
3 checks passed
@Hanumanth04 Hanumanth04 deleted the fixTosaAvg2dpoolFallouts branch November 13, 2024 11:20
rahuls-cerebras added a commit that referenced this pull request Jan 3, 2025
4 participants