[AutoScheduler] Support layout rewrite for whole networks #6987
Conversation
@jcf94 @comaniac @FrozenGene @junrushao1994 Let us merge this as soon as possible, so we can get good performance on CPU and publish the CPU tutorial.
@@ -371,8 +379,29 @@ def conv2d_nhwc(Input, Filter, stride, padding, dilation, out_dtype="float32"):
    else:
        dilation_h, dilation_w = dilation

    if auto_scheduler_rewritten_layout:
        # Infer shape for the rewritten layout
As discussed, we need to extract it. If convenient, we could mark this as a TODO.
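For context, this branch recovers the weight tensor's logical dimensions from the rewritten layout string instead of reading them from Filter.shape. A minimal sketch of that idea, assuming the auto_scheduler.get_shape_from_rewritten_layout helper; the exact code added in this PR may differ:

from tvm import auto_scheduler

# Sketch: inside conv2d_nhwc, after the dilation handling shown above.
if auto_scheduler_rewritten_layout:
    # The weight has been transformed to the auto-scheduler's preferred layout,
    # so infer the original kernel dimensions from the rewritten layout string.
    kernel_h, kernel_w, channel, num_filter = auto_scheduler.get_shape_from_rewritten_layout(
        auto_scheduler_rewritten_layout, ["ry", "rx", "rc", "ff"]
    )
else:
    kernel_h, kernel_w, channel, num_filter = Filter.shape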
with tempfile.NamedTemporaryFile() as fp:
    log_file = fp.name
    # log_file = "test_layout_rewrite.json"
Remove it. If we want to merge this soon, we could leave it for the cleanup PR.
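For reference, the pattern the comment refers to keeps the tuning log in a throwaway temporary file; a minimal sketch of that pattern (the tuning options below are placeholders, not the exact test code):

import tempfile
from tvm import auto_scheduler

# Sketch: write tuning records to a temporary file so no hard-coded
# log path (like the commented-out one above) is needed.
with tempfile.NamedTemporaryFile() as fp:
    log_file = fp.name
    tuning_options = auto_scheduler.TuningOptions(
        num_measure_trials=2,
        measure_callbacks=[auto_scheduler.RecordToFile(log_file)],
    )
    # ... run the search with tuning_options, then build and check that the
    # layout-rewritten network produces correct results (placeholder step).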
* [AutoScheduler] Add layout rewrite pass in relay
* fix
* fix lint
* fix attrs
* trigger CI
* Apply suggestions from code review
* trigger CI
* Update python/tvm/auto_scheduler/relay_integration.py
* Update python/tvm/auto_scheduler/relay_integration.py
* Update python/tvm/auto_scheduler/compute_dag.py
* Trigger CI
* Apply suggestions from code review
Auto-scheduler can infer a good layout for constant tensors (e.g., the weight tensors in conv2d and dense) according to the loop structure created by the schedule. This PR adds a relay pass to support this kind of layout rewrite at the graph level. The pass inserts the necessary layout transforms into the relay program and pre-computes them with the FoldConstant pass.

Changes overview:
* Add a new op auto_scheduler_layout_transform and its topi compute definition. This op can handle more general layout transforms than the existing relay.op.transform.layout_transform.
* Add a new relay pass AutoSchedulerLayoutRewrite to do the layout rewrite.
* Add a new attribute auto_scheduler_rewritten_layout to Conv2dAttrs to indicate the new layout after rewriting.
* See python/tvm/topi/nn/conv2d.py for examples of how to enable the layout rewrite for a TOPI compute.

Co-authored by @minminsun.
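For reference, a rough end-to-end sketch of how the graph-level layout rewrite is exercised: tuning records are applied with ApplyHistoryBest and the network is built with the auto-scheduler integration enabled. The workload, log-file name, and the relay.backend.use_auto_scheduler option reflect the current auto_scheduler relay integration and are assumptions here, not code from this PR:

import tvm
from tvm import relay, auto_scheduler
from tvm.relay import testing

# A small test network stands in for a real model (placeholder workload).
mod, params = testing.mlp.get_workload(batch_size=1)
target = tvm.target.Target("llvm")

# Hypothetical log file produced by a prior auto_scheduler search over
# the tasks extracted from this network.
log_file = "tuning_records.json"

with auto_scheduler.ApplyHistoryBest(log_file):
    with tvm.transform.PassContext(
        opt_level=3, config={"relay.backend.use_auto_scheduler": True}
    ):
        # During the build, AutoSchedulerLayoutRewrite inserts
        # auto_scheduler_layout_transform ops for the rewritten weights,
        # and FoldConstant pre-computes the transformed constants.
        lib = relay.build(mod, target=target, params=params)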