
[MINOR][DNM] Testing bundle validation for 1.0.1-rc1 #78

Triggered via pull request February 1, 2025 21:03
Status Success
Total duration 19m 24s
Artifacts
Matrix: validate-release-candidate-bundles

Annotations

3 errors and 217 warnings
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
  Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
  Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
  Trying to access closed classloader. Please check if you store classloaders directly or indirectly in static fields. If the stacktrace suggests that the leak occurs in a third party library and cannot be fixed immediately, you can disable this check with the configuration 'classloader.check-leaked-classloader'.
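The three errors above are Flink's classloader leak check firing at shutdown. If the leak comes from a third-party dependency and cannot be fixed in the bundle itself, the escape hatch the message points at is a Flink configuration entry; a minimal sketch, assuming a standard flink-conf.yaml:

```yaml
# flink-conf.yaml — this disables only the leak *check*, not the leak itself
classloader.check-leaked-classloader: false
```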
validate-release-candidate-bundles (scala-2.13, flink1.20, spark3.5, spark3.5.1)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
  Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
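The recurring "Path Validation Error" warning comes from GitHub's cache action when the configured `path` does not exist at post-job save time (for example, when the job never populated it), so nothing is cached but the job still passes. A hypothetical sketch of such a step — the `path` and `key` here are illustrative, not the workflow's actual values:

```yaml
# Illustrative only: the warning means `path` was absent when the
# post-job cache save ran, so no cache entry was written.
- uses: actions/cache@v4
  with:
    path: ~/.m2/repository          # must exist for the save to succeed
    key: maven-${{ hashFiles('**/pom.xml') }}
```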
validate-release-candidate-bundles (scala-2.13, flink1.19, spark3.5, spark3.5.1)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
  Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
validate-release-candidate-bundles (scala-2.13, flink1.18, spark3.5, spark3.5.1)
  Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
validate-release-candidate-bundles (scala-2.12, flink1.15, spark3.3, spark3.3.4)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
  Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
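The steps above (write via the Spark DataSource, run Hive sync, then query back with Spark SQL and HiveQL) run with the release-candidate bundle on Spark's classpath. A sketch of the kind of spark-defaults configuration this implies, per the Hudi quickstart — the jar name is assumed from the staged artifact naming for this matrix cell, not read from the workflow:

```
spark.jars                       hudi-spark3.3-bundle_2.12-1.0.1-rc1.jar
spark.serializer                 org.apache.spark.serializer.KryoSerializer
spark.sql.extensions             org.apache.spark.sql.hudi.HoodieSparkSessionExtension
spark.sql.catalog.spark_catalog  org.apache.spark.sql.hudi.catalog.HoodieCatalog
```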
validate-release-candidate-bundles (scala-2.12, flink1.14, spark3.3, spark3.3.4)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
  Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
validate-release-candidate-bundles (scala-2.12, flink1.16, spark3.4, spark3.4.3)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
  Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh skip validating utilities bundle for non-spark3.5 build
  validate.sh validating utilities slim bundle
validate-release-candidate-bundles (scala-2.12, flink1.17, spark3.5, spark3.5.1)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  Error: Path Validation Error: Path(s) specified in the action for caching do(es) not exist, hence no cache is being saved.
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  validate.sh Query and validate the results using Spark SQL
  validate.sh Query and validate the results using HiveQL
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle