Fix test failures in window_function_test.py #11019
This was the one I was most worried about. It looks like the tests pass correctly when I'm inclined to have
mythrocks added a commit to mythrocks/spark-rapids that referenced this issue on Jun 17, 2024:
Fixes NVIDIA#11019.

Window function tests fail on Spark 4.0 because of NVIDIA#5114 (and NVIDIA#5120 broadly): spark-rapids does not support SUM, COUNT, and certain other aggregations in ANSI mode. This commit disables ANSI mode for the failing window function tests. These may be revisited once error/overflow checking is available for ANSI mode in spark-rapids.

Signed-off-by: MithunR <mithunr@nvidia.com>
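The commit above disables ANSI mode for the affected tests via a test decorator. As an illustration only, the following minimal sketch shows how such a `disable_ansi_mode` decorator might pin `spark.sql.ansi.enabled` to `false` in a test's configuration; the real marker lives in spark-rapids' pytest integration-test framework, and here a plain dict stands in for the Spark conf:

```python
import functools

def disable_ansi_mode(test_fn):
    """Force ANSI mode off in the conf passed to the wrapped test (illustrative sketch)."""
    @functools.wraps(test_fn)
    def wrapper(*args, conf=None, **kwargs):
        # Copy the incoming conf so the override stays local to this test,
        # then pin Spark's ANSI-mode switch off.
        conf = dict(conf or {})
        conf["spark.sql.ansi.enabled"] = "false"
        return test_fn(*args, conf=conf, **kwargs)
    return wrapper

@disable_ansi_mode
def test_window_running_sum(conf=None):
    # A real test would run a windowed SUM/COUNT through Spark with this conf;
    # here we only return the conf to show that ANSI mode is pinned off.
    return conf
```

Calling `test_window_running_sum()` yields a conf with `spark.sql.ansi.enabled` set to `"false"`, regardless of the session default, which is the behavior the commit relies on for the failing window tests.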
wjxiz1992 added a commit to nvliyuan/yuali-spark-rapids that referenced this issue on Jun 26, 2024:
* Optimizing Expand+Aggregate in SQL with many count-distinct aggregations
  Signed-off-by: Hongbin Ma (Mahone) <mahongbin@apache.org>
* Add GpuBucketingUtils shim to Spark 4.0.0 (NVIDIA#11092)
  Signed-off-by: Raza Jafri <rjafri@nvidia.com>
* Improve the diagnostics for 'conv' fallback explain (NVIDIA#11076)
  Don't use nil; the bases should not appear as an empty string in the error message when the user input is not empty; more user-friendly message. Updates sql-plugin/src/main/scala/org/apache/spark/sql/rapids/stringFunctions.scala.
  Signed-off-by: Jihoon Son <ghoonson@gmail.com>
  Co-authored-by: Gera Shegalov <gshegalov@nvidia.com>
* Disable ANSI mode for window function tests [databricks] (NVIDIA#11073)
  Fixes NVIDIA#11019. Window function tests fail on Spark 4.0 because of NVIDIA#5114 (and NVIDIA#5120 broadly): spark-rapids does not support SUM, COUNT, and certain other aggregations in ANSI mode. This commit disables ANSI mode for the failing window function tests, to be revisited once error/overflow checking is available for ANSI mode in spark-rapids. Also switches from @ansi_mode_disabled to @disable_ansi_mode.
  Signed-off-by: MithunR <mithunr@nvidia.com>

Co-authored-by: Hongbin Ma (Mahone) <mahongbin@apache.org>
Co-authored-by: Raza Jafri <razajafri@users.noreply.github.com>
Co-authored-by: Jihoon Son <jihoonson@apache.org>
Co-authored-by: Gera Shegalov <gshegalov@nvidia.com>
Co-authored-by: MithunR <mithunr@nvidia.com>
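One of the squashed changes above improves the fallback-explain message for 'conv' so that the offending bases are named rather than rendered as empty strings. The real logic is Scala in spark-rapids' stringFunctions.scala; the following is a hypothetical Python sketch of that kind of diagnostic, where the set of GPU-supported bases is an assumption for illustration, not the plugin's actual support matrix:

```python
def conv_fallback_reason(num_expr, from_base, to_base):
    """Return a human-readable reason why conv() falls back to CPU, or None if supported.

    Hypothetical sketch: `supported` below is an assumed set of GPU-supported
    bases, chosen only to illustrate the shape of the diagnostic message.
    """
    supported = (10, 16)
    if from_base not in supported or to_base not in supported:
        # Name the actual bases the user passed instead of emitting an
        # empty string, so the explain output is actionable.
        return (f"conv({num_expr}, {from_base}, {to_base}) falls back to CPU: "
                f"only bases {supported} are supported on the GPU")
    return None
```

With this shape, `conv_fallback_reason("x", 2, 10)` produces a message naming base 2, while a supported pair such as (10, 16) yields no fallback reason.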
SurajAralihalli pushed a commit to SurajAralihalli/spark-rapids that referenced this issue on Jul 12, 2024:
* Disable ANSI mode for window function tests.
  Fixes NVIDIA#11019. Window function tests fail on Spark 4.0 because of NVIDIA#5114 (and NVIDIA#5120 broadly): spark-rapids does not support SUM, COUNT, and certain other aggregations in ANSI mode. This commit disables ANSI mode for the failing window function tests. These may be revisited once error/overflow checking is available for ANSI mode in spark-rapids.
* Switch from @ansi_mode_disabled to @disable_ansi_mode.

Signed-off-by: MithunR <mithunr@nvidia.com>