
[HUDI-8850] Fix record merge mode related issues and improve test coverage in Spark SQL #38355
Triggered via pull request: January 29, 2025 20:19
Status: Cancelled
Total duration: 19m 4s
Artifacts

bot.yml
on: pull_request
validate-source: 2m 44s
Matrix: build-spark-java17
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java-tests
Matrix: test-spark-java11-17-java-tests
Matrix: test-spark-java11-17-scala-tests
Matrix: test-spark-java17-java-tests
Matrix: test-spark-java17-scala-tests
Matrix: test-spark-scala-tests
Matrix: validate-bundles

Annotations

40 errors and 120 warnings
validate-bundles (scala-2.12, flink1.17, 1.11.1, spark3.5, spark3.5.1)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
  The operation was canceled.
validate-bundles (scala-2.12, flink1.16, 1.11.1, spark3.4, spark3.4.3)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
  The operation was canceled.
test-spark-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-flink (flink1.20, 1.11.3)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
  The operation was canceled.
test-spark-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-java11-17-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-java-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
validate-bundles (scala-2.12, flink1.14, 1.10.0, spark3.3, spark3.3.4)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
  The operation was canceled.
test-spark-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-java17-scala-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-java11-17-scala-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-java17-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
validate-bundles (scala-2.12, flink1.15, 1.10.0, spark3.3, spark3.3.4)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
  The operation was canceled.
test-spark-java11-17-java-tests (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-java17-java-tests (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
integration-tests (spark3.5, spark-3.5.3/spark-3.5.3-bin-hadoop3.tgz)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
  The operation was canceled.
test-spark-java11-17-scala-tests (scala-2.13, spark3.5, hudi-spark-datasource/hudi-spark3.5.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-scala-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
test-spark-java17-java-tests (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x)
  Canceling since a higher priority waiting request for 'refs/pull/12725/merge' exists
docker-java17-test (scala-2.13, flink1.20, spark3.5, spark3.5.0)
  docker_test_java17.sh Building Hudi with Java 8
  docker_test_java17.sh Done building Hudi with Java 8
  docker_test_java17.sh copying hadoop conf
  docker_test_java17.sh starting hadoop hdfs
  docker_test_java17.sh starting datanode:1
  docker_test_java17.sh starting datanode:2
  docker_test_java17.sh starting datanode:3
  docker_test_java17.sh starting hadoop hdfs, hdfs report
  docker_test_java17.sh Running tests with Java 17
  docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.12, flink1.20, spark3.5, spark3.5.0)
  docker_test_java17.sh Building Hudi with Java 8
  docker_test_java17.sh Done building Hudi with Java 8
  docker_test_java17.sh copying hadoop conf
  docker_test_java17.sh starting hadoop hdfs
  docker_test_java17.sh starting datanode:1
  docker_test_java17.sh starting datanode:2
  docker_test_java17.sh starting datanode:3
  docker_test_java17.sh starting hadoop hdfs, hdfs report
  docker_test_java17.sh Running tests with Java 17
  docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (scala-2.12, flink1.20, spark3.4, spark3.4.0)
  docker_test_java17.sh Building Hudi with Java 8
  docker_test_java17.sh Done building Hudi with Java 8
  docker_test_java17.sh copying hadoop conf
  docker_test_java17.sh starting hadoop hdfs
  docker_test_java17.sh starting datanode:1
  docker_test_java17.sh starting datanode:2
  docker_test_java17.sh starting datanode:3
  docker_test_java17.sh starting hadoop hdfs, hdfs report
  docker_test_java17.sh Running tests with Java 17
  docker_test_java17.sh run_docker_tests Running Hudi maven tests on Docker
validate-bundles (scala-2.13, flink1.20, 1.11.3, spark3.5, spark3.5.1)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh running deltastreamer
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
validate-bundles (scala-2.13, flink1.18, 1.11.1, spark3.5, spark3.5.1)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh running deltastreamer
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell
validate-bundles (scala-2.13, flink1.19, 1.11.1, spark3.5, spark3.5.1)
  validate.sh validating spark & hadoop-mr bundle
  validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
  validate.sh Writing sample data via Spark DataSource and run Hive Sync...
  Use default java runtime under /opt/java/openjdk
  validate.sh spark & hadoop-mr bundles validation was successful.
  validate.sh done validating spark & hadoop-mr bundle
  validate.sh validating utilities bundle
  validate.sh running deltastreamer
  validate.sh done with deltastreamer
  validate.sh validating deltastreamer in spark shell