Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2021/07/15 02:15:46 UTC
Build failed in Jenkins: beam_PostCommit_Java_DataflowV2 #414
See <https://ci-beam.apache.org/job/beam_PostCommit_Java_DataflowV2/414/display/redirect?page=changes>
Changes:
[kawaigin] Misc Fixes
[noreply] [BEAM-12611] Add Instruction ID to LogEntry, by introducing
[kawaigin] Updated screendiff golden screenshots for Linux platforms.
------------------------------------------
[...truncated 59.78 KB...]
INFO:root:pull_licenses_java.py succeed. It took 3.889294 seconds with 16 threads.
> Task :sdks:java:container:java8:copyJavaThirdPartyLicenses NO-SOURCE
> Task :runners:java-fn-execution:compileJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
> Task :runners:java-fn-execution:classes
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :runners:direct-java:compileJava FROM-CACHE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :runners:direct-java:shadowJar FROM-CACHE
> Task :sdks:java:container:java8:copyDockerfileDependencies
> Task :sdks:java:extensions:ml:compileJava FROM-CACHE
> Task :sdks:java:extensions:ml:classes UP-TO-DATE
> Task :sdks:java:extensions:ml:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :examples:java:compileJava FROM-CACHE
> Task :examples:java:classes UP-TO-DATE
> Task :examples:java:jar
> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:testClasses
> Task :examples:java:compileTestJava FROM-CACHE
> Task :examples:java:testClasses
> Task :examples:java:testJar
> Task :sdks:java:io:google-cloud-platform:testJar
> Task :sdks:go:resolveBuildDependencies
Resolving google.golang.org/api: commit='0324d5e90dc7753607860272666845fad9ceb97e', urls=[https://code.googlesource.com/google-api-go-client]
Resolving google.golang.org/genproto: commit='4d944d34d83c502a5f761500a14d8842648415c3', urls=[https://github.com/google/go-genproto]
Resolving google.golang.org/grpc: commit='5e8f83304c0563d1ba74db05fee83d9c18ab9a58', urls=[https://github.com/grpc/grpc-go]
Resolving google.golang.org/protobuf: commit='d165be301fb1e13390ad453281ded24385fd8ebc', urls=[https://go.googlesource.com/protobuf]
Resolving cached github.com/etcd-io/etcd: commit='11214aa33bf5a47d3d9d8dafe0f6b97237dfe921', urls=[https://github.com/etcd-io/etcd.git, git@github.com:etcd-io/etcd.git]
> Task :sdks:go:installDependencies
> Task :sdks:go:buildLinuxAmd64
> Task :sdks:go:goBuild
> Task :sdks:java:container:resolveBuildDependencies
Resolving ./github.com/apache/beam/sdks/go@<https://ci-beam.apache.org/job/beam_PostCommit_Java_DataflowV2/ws/src/sdks/go>
> Task :sdks:java:container:installDependencies
> Task :sdks:java:container:buildLinuxAmd64
> Task :sdks:java:container:goBuild
> Task :sdks:java:container:java8:copySdkHarnessLauncher
> Task :sdks:java:container:java8:dockerPrepare
> Task :sdks:java:container:java8:docker
> Task :runners:google-cloud-dataflow-java:buildAndPushDockerContainer
WARNING: `gcloud docker` will not be supported for Docker client versions above 18.03.
As an alternative, use `gcloud auth configure-docker` to configure `docker` to
use `gcloud` as a credential helper, then use `docker` as you would for non-GCR
registries, e.g. `docker pull gcr.io/project-id/my-image`. Add
`--verbosity=error` to silence this warning: `gcloud docker
--verbosity=error -- pull gcr.io/project-id/my-image`.
See: https://cloud.google.com/container-registry/docs/support/deprecation-notices#gcloud-docker
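The warning above describes the recommended migration away from `gcloud docker`. A minimal sketch of that setup, using the placeholder image name (`gcr.io/project-id/my-image`) from the warning itself:

```shell
# One-time setup: register gcloud as a Docker credential helper.
# This writes a "credHelpers" entry for the gcr.io registries into
# ~/.docker/config.json, so docker asks gcloud for credentials.
gcloud auth configure-docker

# After that, plain docker commands work against GCR directly,
# with no gcloud wrapper needed:
docker pull gcr.io/project-id/my-image
```

With the credential helper configured, CI scripts can drop the deprecated `gcloud docker --` prefix and call `docker` directly.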
> Task :runners:google-cloud-dataflow-java:coreSDKJavaRunnerV2IntegrationTest NO-SOURCE
> Task :runners:google-cloud-dataflow-java:examplesJavaRunnerV2IntegrationTest
org.apache.beam.examples.snippets.transforms.io.gcp.bigquery.BigQuerySamplesIT > testTableIO FAILED
java.lang.RuntimeException at BigQuerySamplesIT.java:121
org.apache.beam.examples.cookbook.BigQueryTornadoesIT > testE2EBigQueryTornadoesWithExport FAILED
java.lang.RuntimeException at BigQueryTornadoesIT.java:68
org.apache.beam.examples.cookbook.BigQueryTornadoesIT > testE2eBigQueryTornadoesWithStorageApi FAILED
java.lang.RuntimeException at BigQueryTornadoesIT.java:68
org.apache.beam.examples.cookbook.BigQueryTornadoesIT > testE2eBigQueryTornadoesWithStorageApiUsingQuery FAILED
java.lang.RuntimeException at BigQueryTornadoesIT.java:68
org.apache.beam.examples.complete.TrafficMaxLaneFlowIT > testE2ETrafficMaxLaneFlow FAILED
java.lang.RuntimeException at TrafficMaxLaneFlowIT.java:74
8 tests completed, 5 failed
> Task :runners:google-cloud-dataflow-java:examplesJavaRunnerV2IntegrationTest FAILED
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformRunnerV2IntegrationTest
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadIT > testBigQueryReadEmpty FAILED
java.lang.RuntimeException at BigQueryIOReadIT.java:86
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT > testBigQueryStorageRead1GAvro FAILED
java.lang.RuntimeException at BigQueryIOStorageReadIT.java:102
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageQueryIT > testBigQueryStorageQuery1G FAILED
java.lang.RuntimeException at BigQueryIOStorageQueryIT.java:94
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryClusteringTableFunction FAILED
java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:198
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOStorageReadIT > testBigQueryStorageRead1GArrow FAILED
java.lang.RuntimeException at BigQueryIOStorageReadIT.java:102
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryClusteringDynamicDestinations FAILED
java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:221
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadIT > testBigQueryRead1G FAILED
java.lang.RuntimeException at BigQueryIOReadIT.java:86
org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaUpdateOptionsIT > testAllowFieldAddition FAILED
java.lang.RuntimeException at BigQuerySchemaUpdateOptionsIT.java:154
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOReadWriteIT > testHL7v2IOE2E FAILED
java.lang.RuntimeException at HL7v2IOReadWriteIT.java:119
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryTimePartitioning FAILED
java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:144
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadIT > testBigQueryRead1M FAILED
java.lang.RuntimeException at BigQueryIOReadIT.java:86
org.apache.beam.sdk.io.gcp.bigquery.BigQuerySchemaUpdateOptionsIT > testAllowFieldRelaxation FAILED
java.lang.RuntimeException at BigQuerySchemaUpdateOptionsIT.java:154
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > testE2EV1Read FAILED
java.lang.RuntimeException at V1ReadIT.java:92
org.apache.beam.sdk.io.gcp.bigquery.BigQueryTimePartitioningClusteringIT > testE2EBigQueryClustering FAILED
java.lang.RuntimeException at BigQueryTimePartitioningClusteringIT.java:169
org.apache.beam.sdk.io.gcp.pubsub.PubsubReadIT > testReadPubsubMessageId FAILED
java.lang.AssertionError at PubsubReadIT.java:88
org.apache.beam.sdk.io.gcp.bigquery.BigQueryIOReadIT > testBigQueryRead1T FAILED
java.lang.RuntimeException at BigQueryIOReadIT.java:86
org.apache.beam.sdk.io.gcp.pubsub.PubsubReadIT > testReadPublicData FAILED
java.lang.AssertionError at PubsubReadIT.java:60
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > testE2EV1ReadWithGQLQueryWithLimit FAILED
java.lang.RuntimeException at V1ReadIT.java:135
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testNewTypesQueryWithReshuffle FAILED
java.lang.RuntimeException at BigQueryToTableIT.java:116
org.apache.beam.sdk.io.gcp.bigquery.BigQueryNestedRecordsIT > testNestedRecords FAILED
java.lang.RuntimeException at BigQueryNestedRecordsIT.java:116
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testQuery FAILED
java.lang.RuntimeException at SpannerReadIT.java:169
org.apache.beam.sdk.io.gcp.datastore.V1ReadIT > testE2EV1ReadWithGQLQueryWithNoLimit FAILED
java.lang.RuntimeException at V1ReadIT.java:135
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testLegacyQueryWithoutReshuffle FAILED
java.lang.RuntimeException at BigQueryToTableIT.java:116
org.apache.beam.sdk.io.gcp.bigtable.BigtableWriteIT > testE2EBigtableWrite FAILED
java.lang.RuntimeException at BigtableWriteIT.java:130
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testRead FAILED
java.lang.RuntimeException at SpannerReadIT.java:149
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testNewTypesQueryWithoutReshuffle FAILED
java.lang.RuntimeException at BigQueryToTableIT.java:116
org.apache.beam.sdk.io.gcp.healthcare.HL7v2IOWriteIT > testHL7v2IOWrite FAILED
java.lang.RuntimeException at HL7v2IOWriteIT.java:92
org.apache.beam.sdk.io.gcp.spanner.SpannerReadIT > testReadAllRecordsInDb FAILED
java.lang.RuntimeException at SpannerReadIT.java:201
org.apache.beam.sdk.io.gcp.bigquery.BigQueryToTableIT > testStandardQueryWithoutCustom FAILED
java.lang.RuntimeException at BigQueryToTableIT.java:116
org.apache.beam.sdk.io.gcp.bigtable.BigtableReadIT > testE2EBigtableRead FAILED
java.lang.RuntimeException at BigtableReadIT.java:58
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1WriteWithLargeEntities FAILED
java.lang.RuntimeException at V1WriteIT.java:104
org.apache.beam.sdk.io.gcp.datastore.V1WriteIT > testE2EV1Write FAILED
java.lang.RuntimeException at V1WriteIT.java:69
37 tests completed, 32 failed, 1 skipped
> Task :runners:google-cloud-dataflow-java:googleCloudPlatformRunnerV2IntegrationTest FAILED
> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages FAILED
ERROR: (gcloud.container.images.delete) [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210715000118] is not a valid name. Expected tag in the form "base:tag" or "tag" or digest in the form "sha256:<digest>"
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:examplesJavaRunnerV2IntegrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PostCommit_Java_DataflowV2/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/examplesJavaRunnerV2IntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:googleCloudPlatformRunnerV2IntegrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PostCommit_Java_DataflowV2/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/googleCloudPlatformRunnerV2IntegrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file '<https://ci-beam.apache.org/job/beam_PostCommit_Java_DataflowV2/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 279
* What went wrong:
Execution failed for task ':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 15m 9s
135 actionable tasks: 96 executed, 37 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wi32tew2rybnw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_PostCommit_Java_DataflowV2 #415
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_PostCommit_Java_DataflowV2/415/display/redirect>
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org