From a511ca13ab392a620e2731d217cc273de9cf1b10 Mon Sep 17 00:00:00 2001
From: Dongjoon Hyun
Date: Sat, 12 Mar 2022 02:47:57 -0800
Subject: [PATCH] [SPARK-38534][SQL][TESTS] Disable `to_timestamp('366', 'DD')`
 test case

### What changes were proposed in this pull request?

This PR aims to disable the `to_timestamp('366', 'DD')` test case to recover the `ansi` test suite on Java 11+.

### Why are the changes needed?

Currently, the daily Java 11 and 17 GitHub Actions jobs are broken.
- https://github.com/apache/spark/runs/5511239176?check_suite_focus=true
- https://github.com/apache/spark/runs/5513540604?check_suite_focus=true

**Java 8**
```
$ bin/spark-shell --conf spark.sql.ansi.enabled=true
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/12 00:59:31 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.16.0.31:4040
Spark context available as 'sc' (master = local[*], app id = local-1647075572229).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
      /_/

Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 1.8.0_322)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sql("select to_timestamp('366', 'DD')").show
java.time.format.DateTimeParseException: Text '366' could not be parsed, unparsed text found at index 2. If necessary set spark.sql.ansi.enabled to false to bypass this error.
```

**Java 11+**
```
$ bin/spark-shell --conf spark.sql.ansi.enabled=true
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/03/12 01:00:07 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://172.16.0.31:4040
Spark context available as 'sc' (master = local[*], app id = local-1647075607932).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0-SNAPSHOT
      /_/

Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 11.0.12)
Type in expressions to have them evaluated.
Type :help for more information.

scala> sql("select to_timestamp('366', 'DD')").show
java.time.DateTimeException: Invalid date 'DayOfYear 366' as '1970' is not a leap year. If necessary set spark.sql.ansi.enabled to false to bypass this error.
```
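The difference comes from `java.time` itself rather than from any Spark-side change: with the two-letter pattern `DD`, Java 8 consumes exactly two digits and then rejects the leftover `6`, while Java 11+ consumes all three digits and only fails later, when day-of-year 366 is resolved against the default year 1970. The sketch below is a minimal approximation of that behavior, not Spark's actual formatter setup; the default year 1970 and the STRICT resolver style are assumptions inferred from the error messages above, not details taken from this patch.
```
import java.time.format.{DateTimeFormatterBuilder, ResolverStyle}
import java.time.temporal.ChronoField
import java.util.Locale

// Minimal day-of-year formatter; Spark builds its real formatter elsewhere (not shown in this patch).
val formatter = new DateTimeFormatterBuilder()
  .appendPattern("DD")                       // day-of-year with two pattern letters
  .parseDefaulting(ChronoField.YEAR, 1970L)  // assumed default year, taken from the error message
  .toFormatter(Locale.US)
  .withResolverStyle(ResolverStyle.STRICT)

// Java 8:   only two digits are consumed, so the trailing '6' is left over
//           -> java.time.format.DateTimeParseException ("unparsed text found at index 2")
// Java 11+: all three digits are consumed and resolution fails instead, because
//           1970 has no day 366 -> java.time.DateTimeException ("not a leap year")
formatter.parse("366")
```
Either way the query fails under ANSI mode; only the exception class and message differ, which is why a single golden file cannot match both JDK lines and the query is commented out instead.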
### Does this PR introduce _any_ user-facing change?

No.

### How was this patch tested?

Test with Java 11+.

**BEFORE**
```
$ java -version
openjdk version "17.0.2" 2022-01-18 LTS
OpenJDK Runtime Environment Zulu17.32+13-CA (build 17.0.2+8-LTS)
OpenJDK 64-Bit Server VM Zulu17.32+13-CA (build 17.0.2+8-LTS, mixed mode, sharing)

$ build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite -- -z ansi/datetime-parsing-invalid.sql"
...
[info] SQLQueryTestSuite:
01:23:00.219 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
01:23:05.209 ERROR org.apache.spark.sql.SQLQueryTestSuite: Error using configs:
[info] - ansi/datetime-parsing-invalid.sql *** FAILED *** (267 milliseconds)
[info]   ansi/datetime-parsing-invalid.sql
[info]   Expected "java.time.[format.DateTimeParseException
[info]   Text '366' could not be parsed, unparsed text found at index 2]. If necessary set s...", but got "java.time.[DateTimeException
[info]   Invalid date 'DayOfYear 366' as '1970' is not a leap year]. If necessary set s..." Result did not match for query #8
[info]   select to_timestamp('366', 'DD') (SQLQueryTestSuite.scala:476)
...
[info] Run completed in 7 seconds, 389 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 0, failed 1, canceled 0, ignored 0, pending 0
[info] *** 1 TEST FAILED ***
[error] Failed tests:
[error] 	org.apache.spark.sql.SQLQueryTestSuite
[error] (sql / Test / testOnly) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 21 s, completed Mar 12, 2022, 1:23:05 AM
```

**AFTER**
```
$ build/sbt "sql/testOnly org.apache.spark.sql.SQLQueryTestSuite -- -z ansi/datetime-parsing-invalid.sql"
...
[info] SQLQueryTestSuite:
[info] - ansi/datetime-parsing-invalid.sql (390 milliseconds)
...
[info] Run completed in 7 seconds, 673 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 20 s, completed Mar 12, 2022, 1:24:52 AM
```

Closes #35825 from dongjoon-hyun/SPARK-38534.

Authored-by: Dongjoon Hyun
Signed-off-by: Dongjoon Hyun
---
 .../sql-tests/inputs/datetime-parsing-invalid.sql  |  3 ++-
 .../results/ansi/datetime-parsing-invalid.sql.out  | 11 +----------
 .../results/datetime-parsing-invalid.sql.out       | 10 +---------
 3 files changed, 4 insertions(+), 20 deletions(-)

diff --git a/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql b/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql
index a6d743cab5480..1d1e2a5282c81 100644
--- a/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql
+++ b/sql/core/src/test/resources/sql-tests/inputs/datetime-parsing-invalid.sql
@@ -14,7 +14,8 @@ select to_timestamp('366', 'D');
 select to_timestamp('9', 'DD');
 -- in java 8 this case is invalid, but valid in java 11, disabled for jenkins
 -- select to_timestamp('100', 'DD');
-select to_timestamp('366', 'DD');
+-- The error message is changed since Java 11+
+-- select to_timestamp('366', 'DD');
 select to_timestamp('9', 'DDD');
 select to_timestamp('99', 'DDD');
 select to_timestamp('30-365', 'dd-DDD');
diff --git a/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out b/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out
index 5dc3b85b3a9eb..59761d5ac53f0 100644
--- a/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/ansi/datetime-parsing-invalid.sql.out
@@ -1,5 +1,5 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 29
+-- Number of queries: 28


 -- !query
@@ -74,15 +74,6 @@ org.apache.spark.SparkUpgradeException
 You may get a different result due to the upgrading to Spark >= 3.0: Fail to parse '9' in the new parser. You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0, or set to CORRECTED and treat it as an invalid datetime string.


--- !query
-select to_timestamp('366', 'DD')
--- !query schema
-struct<>
--- !query output
-java.time.format.DateTimeParseException
-Text '366' could not be parsed, unparsed text found at index 2. If necessary set spark.sql.ansi.enabled to false to bypass this error.
-
-
 -- !query
 select to_timestamp('9', 'DDD')
 -- !query schema
diff --git a/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out b/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out
index 33504709c08ec..9fc28876a5b2a 100644
--- a/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/datetime-parsing-invalid.sql.out
@@ -1,5 +1,5 @@
 -- Automatically generated by SQLQueryTestSuite
--- Number of queries: 29
+-- Number of queries: 28


 -- !query
@@ -72,14 +72,6 @@ org.apache.spark.SparkUpgradeException
 You may get a different result due to the upgrading to Spark >= 3.0: Fail to parse '9' in the new parser. You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0, or set to CORRECTED and treat it as an invalid datetime string.


--- !query
-select to_timestamp('366', 'DD')
--- !query schema
-struct
--- !query output
-NULL
-
-
 -- !query
 select to_timestamp('9', 'DDD')
 -- !query schema