
[Native WebGPU EP] Add packedQKV and do_rotary attribute support to GroupQueryAttention operator #23386

Open · wants to merge 21 commits into main

Conversation

satyajandhyala (Contributor)

Description

Add packed QKV input support and the do_rotary attribute to the GroupQueryAttention (GQA) operator.
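For context on the packed-QKV input, here is a hedged sketch of how the packed hidden axis splits into Q, K, and V parts. The attribute names `num_heads`, `kv_num_heads`, and `head_size` follow the GQA contrib-op spec; the helper itself is illustrative, not ORT code.

```cpp
#include <cassert>

// Hypothetical helper: a packed QKV tensor lays Q, K, and V side by side
// along the last axis: [Q | K | V] with widths
//   num_heads*head_size, kv_num_heads*head_size, kv_num_heads*head_size.
struct QkvSplit {
  int q_size;   // width of the Q slice
  int kv_size;  // width of each of the K and V slices
};

QkvSplit SplitPackedQkv(int num_heads, int kv_num_heads, int head_size) {
  return {num_heads * head_size, kv_num_heads * head_size};
}
```

With 32 query heads, 8 KV heads, and head size 128 (typical GQA shapes), the packed hidden dimension is 4096 + 1024 + 1024 = 6144.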

Motivation and Context

Packed QKV inputs and the do_rotary attribute are required by certain models.
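For readers unfamiliar with do_rotary: it enables rotary position embedding (RoPE) inside the operator. The sketch below applies the common RoPE formulation (interleaved pairs, base 10000) to a single head vector; it is a minimal illustration, not the ORT WebGPU kernel.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative RoPE: rotate consecutive pairs (x[i], x[i+1]) by an angle
// that depends on the token position and the pair index.
std::vector<float> ApplyRotary(const std::vector<float>& x, int pos) {
  const int d = static_cast<int>(x.size());
  std::vector<float> out(d);
  for (int i = 0; i + 1 < d; i += 2) {
    float theta = pos * std::pow(10000.0f, -static_cast<float>(i) / d);
    float c = std::cos(theta);
    float s = std::sin(theta);
    out[i] = x[i] * c - x[i + 1] * s;
    out[i + 1] = x[i] * s + x[i + 1] * c;
  }
  return out;
}
```

At position 0 the rotation angle is zero, so the vector passes through unchanged.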

github-actions bot (Contributor) commented:

You can commit the suggested changes from lintrunner.

@guschmue guschmue added the ep:WebGPU ort-web webgpu provider label Jan 16, 2025
@satyajandhyala satyajandhyala marked this pull request as ready for review February 20, 2025 01:46

@guschmue (Contributor)

This needs to be merged with main to resolve some conflicts with FA2, and we need to add a condition to not call FA2. Something like:

if (!do_rotary_ && CanApplyFlashAttention
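A minimal self-contained sketch of that guard, with a stand-in stub: the point is that the FA2 path does not handle rotary embedding, so flash attention must be skipped whenever do_rotary_ is set. The real CanApplyFlashAttention check lives in ORT's WebGPU attention code and takes more context than shown here.

```cpp
#include <cassert>

// Stub standing in for ORT's actual flash-attention eligibility check.
static bool CanApplyFlashAttention() { return true; }

// FA2 is only taken when rotary embedding is NOT requested and the
// eligibility check passes; do_rotary forces the non-FA2 path.
bool UseFlashAttention(bool do_rotary) {
  return !do_rotary && CanApplyFlashAttention();
}
```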

Labels
ep:WebGPU ort-web webgpu provider