Fix Parquet tests for Databricks 14.3 #11531
Build the plugin against the Databricks 14.3 cluster using #11467. Once the build succeeds, run the Parquet tests with:

TESTS=parquet_test.py jenkins/databricks/test.sh

The following tests fail:
Comments

This can be ported from #11519.
Hey, @gerashegalov. Would it be alright if I took this one?

Edit: I've modified #11519 to include a fix for this failure.
mythrocks added a commit to mythrocks/spark-rapids that referenced this issue on Oct 4, 2024.
OK, assigning it to you, @mythrocks.
mythrocks added a commit that referenced this issue on Oct 8, 2024:
Spark 4: Fix parquet_test.py. Fixes #11015 (Spark 4 failure). Also fixes #11531 (Databricks 14.3 failure). Contributes to #11004.

This commit addresses the tests that fail in parquet_test.py when run on Spark 4:

1. Some of the tests were failing as a result of #5114. Those tests have been disabled, at least until we get around to supporting aggregations with ANSI mode enabled.
2. `test_parquet_check_schema_compatibility` fails on Spark 4 regardless of ANSI mode, because it tests implicit type promotions where the read schema includes wider columns than the write schema. This will require new code. The test is disabled until #11512 is addressed.
3. `test_parquet_int32_downcast` had an erroneous setup phase that fails in ANSI mode. This has been corrected, and the test was refactored to run in both ANSI and non-ANSI mode.

Signed-off-by: MithunR <mithunr@nvidia.com>
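For illustration, here is a minimal sketch of the ANSI/non-ANSI parametrization described in item 3, written against plain PySpark and pytest. Everything in it is hypothetical: the test name, fixture wiring, and data are invented for the example, and the actual spark-rapids tests run through the project's integration-test harness and compare GPU against CPU results rather than asserting a simple count.

```python
# Hypothetical sketch (not the actual patch): parametrize a Parquet test
# over ANSI mode, keeping the setup phase valid in both modes. Assumes a
# local PySpark installation.
import pytest
from pyspark.sql import SparkSession


@pytest.fixture(scope="module")
def spark():
    return (SparkSession.builder
            .master("local[1]")
            .appName("ansi-parametrization-sketch")
            .getOrCreate())


@pytest.mark.parametrize("ansi_enabled", ["true", "false"])
def test_parquet_roundtrip_under_ansi(spark, tmp_path, ansi_enabled):
    data_path = str(tmp_path / "PARQUET_DATA")
    spark.conf.set("spark.sql.ansi.enabled", ansi_enabled)
    try:
        # The setup keeps every value inside the INT range, so the cast
        # below cannot overflow; a setup that overflows under ANSI mode is
        # the kind of failure the commit message describes correcting.
        (spark.range(0, 100)
             .selectExpr("CAST(id AS INT) AS a")
             .write.mode("overwrite").parquet(data_path))
        assert spark.read.parquet(data_path).count() == 100
    finally:
        spark.conf.unset("spark.sql.ansi.enabled")
```

The key point is the setup phase: Spark 4 enables ANSI mode by default, so setup code that previously overflowed silently now raises an error, which is why running each test in both modes catches these regressions.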
Closing this; we've addressed the failure in #11519.