[Data] Enable optimizer by default #34937
Conversation
Signed-off-by: Scott Lee <sjl@anyscale.com>
Signed-off-by: Hao Chen <chenh1024@gmail.com>
python/ray/data/__init__.py
```python
# Used for caching user-defined callable classes.
# Key is the class, value is the object.
# See make_callable_class_concurrent in python/ray/data/_internal/execution/util.py.
# The reason why this is a dict is because we may fuse multiple map operators into one.
```
Do we actually fuse multiple actors into one? I don't think we do that, or should do that.
Using a dict is a bit concerning, since we could leak closures over time, compared to a singleton that is simply overwritten.
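To make the trade-off above concrete, here is a hedged, self-contained sketch of the two caching strategies under discussion. The names (`_callable_cache`, `get_or_create*`) are illustrative only and are not Ray's actual API; the real cache lives in `python/ray/data/_internal/execution/util.py`.

```python
# Strategy 1: dict keyed by class. Entries are never evicted, so any
# closures captured by a callable class stay alive for the lifetime of
# the worker -- this is the potential leak the reviewer is flagging.
_callable_cache: dict = {}


def get_or_create(cls):
    if cls not in _callable_cache:
        _callable_cache[cls] = cls()
    return _callable_cache[cls]


# Strategy 2: a single overwritten slot. Replacing the singleton drops
# the reference to the previous object so it can be garbage-collected,
# but this only works if a worker runs one callable class at a time.
_singleton = None


def get_or_create_singleton(cls):
    global _singleton
    if _singleton is None or type(_singleton) is not cls:
        _singleton = cls()
    return _singleton
```

The dict variant is only needed if several distinct callable classes can be live in the same worker at once, e.g. after fusing multiple map operators into one.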
```python
    input_files=[],
    exec_stats=None,
),
read_task.get_metadata(),
```
Can you make sure this calls `cleaned_metadata(read_task)` in `legacy_compat.py`, to implement the same logic?
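To illustrate what "cleaning" the metadata means here, below is a minimal sketch. The `BlockMetadata` stand-in and its fields are simplified assumptions modeled on the diff above (`input_files=[]`, `exec_stats=None`); the real `cleaned_metadata` in `legacy_compat.py` operates on Ray Data's actual types and may differ.

```python
from dataclasses import dataclass, replace
from typing import List, Optional


@dataclass
class BlockMetadata:
    # Simplified stand-in for Ray Data's block metadata (assumption).
    num_rows: Optional[int]
    size_bytes: Optional[int]
    input_files: List[str]
    exec_stats: Optional[object]


def cleaned_metadata(meta: BlockMetadata) -> BlockMetadata:
    # Drop per-execution fields so the read task's metadata estimate
    # is not carried over verbatim into the executed block's stats.
    return replace(meta, input_files=[], exec_stats=None)
```

The point of routing both code paths through one helper is that the "which fields get reset" decision lives in exactly one place.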
```python
_assert_has_stages(ds._plan._last_optimized_stages, ["ReadRange->Map"])


def test_optimize_reorder(ray_start_regular_shared):
    # The ReorderRandomizeBlocksRule optimizer rule collapses RandomizeBlocks operators,
```
Shouldn't it still show up in the stage names?
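As a toy illustration of what a "collapse redundant RandomizeBlocks" rewrite could look like, here is a hedged sketch over a flat list of stage names. This is not Ray Data's real `ReorderRandomizeBlocksRule`, which operates on a plan tree; it only shows the collapsing idea the comment describes.

```python
def collapse_randomize_blocks(ops: list) -> list:
    """Drop adjacent duplicate RandomizeBlocks stages (illustrative only).

    Shuffling twice in a row yields the same distribution as shuffling
    once, so back-to-back RandomizeBlocks operators are redundant.
    """
    out = []
    for op in ops:
        if op == "RandomizeBlocks" and out and out[-1] == "RandomizeBlocks":
            continue
        out.append(op)
    return out
```

Note that under this sketch a single `RandomizeBlocks` survives, which is consistent with the reviewer's expectation that the stage should still show up in the optimized stage names.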
This reverts commit 0f375ff.
As this PR is getting huge, I'll split it into a few smaller PRs; the first two are #35621 and #35648. Other known issues not fixed in those two PRs include:
## Why are these changes needed?

This PR is the 1st part of enabling the optimizer by default (split from #34937).
- Fix inconsistent behaviors for the Read op by reusing the `ReadTask`s from `read_api.py` in `plan_read_op.py`.
- Support caching in `materialize`.

Note: some changes in this PR may not be covered by this PR's CI, as the optimizer must be enabled to exercise them. But they are already verified in #34937's CI.

## Related issue number

#32596
## Why are these changes needed?

This PR is the 2nd part of enabling the optimizer by default (split from #34937). It fixes the following issues:
- `ray_remote_args` not correctly set for a fused operator.
- `init_fn` not correctly set for a fused operator.
- Allowed cases for fusion (see `operator_fusion.py`).
- `ray_remote_args` compatibility check for fusion.
- Limit operator not handled when converting a logical operator to a physical one.
- Other small fixes.

Note: some changes in this PR may not be covered by this PR's CI, as the optimizer must be enabled to exercise them. But they are already verified in #34937's CI.

## Related issue number

#32596
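To give a feel for the `ray_remote_args` compatibility check mentioned above, here is a hedged sketch. The real logic lives in Ray Data's `operator_fusion.py` and is more involved; the function name `can_fuse` and the set of ignored keys below are assumptions for illustration only.

```python
def can_fuse(upstream_args: dict, downstream_args: dict) -> bool:
    """Toy compatibility check: two map operators may fuse only if
    their Ray remote args agree on everything that affects where and
    how the fused task would be scheduled (illustrative only).
    """
    # Keys assumed not to affect placement, so differences are tolerated.
    ignored = {"num_returns", "name"}
    up = {k: v for k, v in upstream_args.items() if k not in ignored}
    down = {k: v for k, v in downstream_args.items() if k not in ignored}
    return up == down
```

The rationale is that a fused operator runs as a single task, so it can only honor one set of resource requirements; if the two operators disagree on, say, `num_cpus` vs `num_gpus`, fusion must be skipped.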
## Why are these changes needed?

This PR enables the execution plan optimizer in Ray Data, and fixes some bugs discovered via unit tests. We will ensure that Data CI and release tests are healthy before merging.

## Related issue number

Closes #32596
## Checks

- I've signed off every commit (i.e., `git commit -s`) in this PR.
- I've run `scripts/format.sh` to lint the changes in this PR.
- If I have added a method in Tune, I've added it in `doc/source/tune/api/` under the corresponding `.rst` file.