Posted to builds@beam.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2022/05/04 08:43:40 UTC
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #517
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/517/display/redirect?page=changes>
Changes:
[noreply] fix: JDBC config schema fields order
[Brian Hulette] Revert "Merge pull request #17255 from kileys/test-revert"
[Brian Hulette] BEAM-14231: bypass schema cache for
[noreply] [BEAM-13657] Follow up update version warning in __init__ (#17493)
[noreply] Merge pull request #17431 from [BEAM-14273] Add integration tests for BQ
[noreply] Merge pull request #17205 from [BEAM-14145] [Website] add carousel to
[noreply] [BEAM-14064] fix es io windowing (#17112)
[noreply] [BEAM-13670] Upgraded ipython from v7 to v8 (#17529)
[noreply] [BEAM-11104] Enable ProcessContinuation return values, add unit test
[Robert Bradshaw] [BEAM-14403] Allow Prime to be used with legacy workers.
[noreply] [BEAM-11106] Support drain in Go SDK (#17432)
[noreply] add __Init__ to inference. (#17514)
------------------------------------------
[...truncated 33.73 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\032\n\013NumElements\032\013:\t\n\003int\032\002\020\004\n\034\n\rInitialSplits\032\013:\t\n\003int\032\002\020\004\n\026\n\007KeySize\032\013:\t\n\003int\032\002\020\004\n\030\n\tValueSize\032\013:\t\n\003int\032\002\020\004\n\031\n\nNumHotKeys\032\013:\t\n\003int\032\002\020\004\n\024\n\016HotKeyFraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/04 08:43:33 Using specified **** binary: 'linux_amd64/combine'
2022/05/04 08:43:33 Prepared job with id: load-tests-go-flink-batch-combine-1-0428110756_b865d574-f5f2-4b2d-91dc-581c3682acee and staging token: load-tests-go-flink-batch-combine-1-0428110756_b865d574-f5f2-4b2d-91dc-581c3682acee
2022/05/04 08:43:36 Staged binary artifact with token:
2022/05/04 08:43:37 Submitted job: load0tests0go0flink0batch0combine0100428110756-root-0504084337-8683395f_c4d69cc4-0edf-4d32-8550-4ab4276d46bf
2022/05/04 08:43:37 Job state: STOPPED
2022/05/04 08:43:37 Job state: STARTING
2022/05/04 08:43:37 Job state: RUNNING
2022/05/04 08:43:38 (): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.fnexecution.wire.WireCoders.instantiateRunnerWireCoder(WireCoders.java:94)
at org.apache.beam.runners.fnexecution.wire.WireCoders.instantiateRunnerWireCoder(WireCoders.java:75)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.translateExecutableStage(FlinkBatchPortablePipelineTranslator.java:311)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.translate(FlinkBatchPortablePipelineTranslator.java:272)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.translate(FlinkBatchPortablePipelineTranslator.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:115)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:158)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 18 more
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:158)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 30 more
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:158)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 42 more
Caused by: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.sdk.schemas.SchemaTranslation.fieldTypeFromProtoWithoutNullable(SchemaTranslation.java:328)
at org.apache.beam.sdk.schemas.SchemaTranslation.fieldTypeFromProto(SchemaTranslation.java:244)
at org.apache.beam.sdk.schemas.SchemaTranslation.fieldFromProto(SchemaTranslation.java:238)
at org.apache.beam.sdk.schemas.SchemaTranslation.schemaFromProto(SchemaTranslation.java:212)
at org.apache.beam.runners.core.construction.CoderTranslators$8.fromComponents(CoderTranslators.java:169)
at org.apache.beam.runners.core.construction.CoderTranslators$8.fromComponents(CoderTranslators.java:151)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:170)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 54 more
2022/05/04 08:43:38 (): java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
2022/05/04 08:43:38 Job state: FAILED
2022/05/04 08:43:38 Failed to execute job: job load0tests0go0flink0batch0combine0100428110756-root-0504084337-8683395f_c4d69cc4-0edf-4d32-8550-4ab4276d46bf failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100428110756-root-0504084337-8683395f_c4d69cc4-0edf-4d32-8550-4ab4276d46bf failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf(0x123cb48, 0xc000130000, 0x11182df, 0x19, 0xc0005c9e78, 0x1, 0x1)
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xec
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x414
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 24s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4qbpbzcx4un4q
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Jenkins build is back to normal : beam_LoadTests_Go_Combine_Flink_Batch #739
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/739/display/redirect>
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #738
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/738/display/redirect?page=changes>
Changes:
[chamikaramj] Fixing a breakage of multi-lang auto Runner v2 enabling
[noreply] interface{} -> any for registration (#24600)
[noreply] Restrict tox to be in 3.x version (#24601)
[noreply] [Playground] support for Kafka-enabled examples (#24459)
[noreply] Fix some small notebook typos (#24616)
[noreply] initialize and increment metrics properly (#24592)
[noreply] Add schema conversion support from Kafka Connect Record schemas to Beam
[noreply] interface{} -> any for starcgen (#24618)
[noreply] interface{} -> any for remaining references (#24625)
[noreply] Updating issue-tagger Workflow (#171) (#23143)
[noreply] [GitHub Actions] - Updates in Build Playground Backend to runs-on
[noreply] [GitHub Actions] - Updates in Build Playground Frontend to runs-on
[noreply] [GitHub Actions] - Updates in Go Tests to runs-on Self-hosted runners
[noreply] [GitHub Actions] - Updates in Java Tests to runs-on Self-hosted runners
[noreply] Updated label_prs workflow (#173) (#23145)
[noreply] [CdapIO] CdapIO and SparkReceiverIO updates (#24436)
[noreply] Revert "[GitHub Actions] - Updates in Java Tests to runs-on Self-hosted
[noreply] Disallow sliding windows with combiner fanout to prevent data loss
------------------------------------------
[...truncated 34.09 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/10 08:45:39 Using specified **** binary: 'linux_amd64/combine'
2022/12/10 08:45:40 Prepared job with id: load-tests-go-flink-batch-combine-1-1210065322_09990da6-d9b0-40d2-9e00-d84a1c358482 and staging token: load-tests-go-flink-batch-combine-1-1210065322_09990da6-d9b0-40d2-9e00-d84a1c358482
2022/12/10 08:45:46 Staged binary artifact with token:
2022/12/10 08:45:47 Submitted job: load0tests0go0flink0batch0combine0101210065322-root-1210084546-3bde4c2a_649f9b7d-5466-44da-b839-ce5de39bed0c
2022/12/10 08:45:47 Job state: STOPPED
2022/12/10 08:45:47 Job state: STARTING
2022/12/10 08:45:47 Job state: RUNNING
2022/12/10 09:38:32 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 25a09c1b38ab2c147506b05ee3f8cd5f)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670661745452_0002_01_000002(beam-loadtests-go-combine-flink-batch-738-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/10 09:38:32 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670661745452_0002_01_000002(beam-loadtests-go-combine-flink-batch-738-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/12/10 09:38:32 Job state: FAILED
2022/12/10 09:38:32 Failed to execute job: job load0tests0go0flink0batch0combine0101210065322-root-1210084546-3bde4c2a_649f9b7d-5466-44da-b839-ce5de39bed0c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101210065322-root-1210084546-3bde4c2a_649f9b7d-5466-44da-b839-ce5de39bed0c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1685788, 0xc000128000}, {0x14e68de?, 0x2048db8?}, {0xc00035be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 53m 28s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uny3q6afayc2y
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #737
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/737/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Exclude IOs already split from Java Precommit job
[Kenneth Knowles] Move expansion services into appropriate precommits
[Kenneth Knowles] Split more IOs out of Java precommit
[Kenneth Knowles] Fix trigger paths for separated IOs
[Kenneth Knowles] Turn rawtype checking back on for core Java SDK
[noreply] [Tour Of Beam] Playground Router GRPC API host (#24542)
[noreply] Bump golang.org/x/net from 0.3.0 to 0.4.0 in /sdks (#24587)
[noreply] Replaced finalize with DoFn Teardown in Neo4jIO (#24571)
[Kenneth Knowles] Simplify bug report templates
[Kenneth Knowles] Fix bugs in issue template yml
[noreply] Fix issue templates (#24597)
[noreply] [#24024] Stop wrapping light weight functions with Contextful as they
[noreply] Sample window size as well (#24388)
[noreply] Implement Kafka Write Schema Transform (#24495)
[Kenneth Knowles] Eliminate null errors from JdbcIO
[noreply] docs(fix): Filter.whereFieldName(s?) ->
[hiandyzhang] ElasticsearchIO: Lower log level in flushBatch to avoid noisy log
------------------------------------------
[...truncated 604 B...]
Cloning repository https://github.com/apache/beam.git
> git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
> git --version # timeout=10
> git --version # 'git version 2.25.1'
> git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/apache/beam.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
> git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 58b4d46655d94374f3d3564752dc12eb98b95456 (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f 58b4d46655d94374f3d3564752dc12eb98b95456 # timeout=10
Commit message: "Merge pull request #24574: Turn rawtype checking back on for core Java SDK"
> git rev-list --no-walk 80980b8be48ece9c6d61dc28f429374b8e7a0e4b # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SETUPTOOLS_USE_DISTUTILS=stdlib
SPARK_LOCAL_IP=127.0.0.1
[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
CLUSTER_NAME=beam-loadtests-go-combine-flink-batch-737
FLINK_NUM_WORKERS=5
DETACHED_MODE=true
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
GCS_BUCKET=gs://beam-flink-cluster
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
GCLOUD_ZONE=us-central1-a
FLINK_TASKMANAGER_SLOTS=1
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-737
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins6000559029918395104.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins3392836023063522059.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=preview-debian11
++ echo us-central1-a
++ sed -E 's/(-[a-z])?$//'
+ GCLOUD_REGION=us-central1
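The two xtrace lines above derive GCLOUD_REGION from GCLOUD_ZONE by stripping the trailing `-<letter>` zone suffix. The same transformation as a standalone sketch, with the value copied from the trace:

```shell
# Derive a GCP region from a zone name by dropping the trailing "-<letter>"
# suffix, mirroring the sed expression used by flink_cluster.sh above.
GCLOUD_ZONE="us-central1-a"
GCLOUD_REGION=$(echo "$GCLOUD_ZONE" | sed -E 's/(-[a-z])?$//')
echo "$GCLOUD_REGION"   # us-central1
```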
+ MASTER_NAME=beam-loadtests-go-combine-flink-batch-737-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
/ [0 files][ 0.0 B/ 2.3 KiB] / [1 files][ 2.3 KiB/ 2.3 KiB] Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
/ [1 files][ 2.3 KiB/ 6.0 KiB] / [2 files][ 6.0 KiB/ 6.0 KiB] Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
/ [2 files][ 6.0 KiB/ 13.5 KiB] / [3 files][ 13.5 KiB/ 13.5 KiB] -
Operation completed over 3 objects/13.5 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
+ local image_version=preview-debian11
+ echo 'Starting dataproc cluster. Dataproc version: preview-debian11'
Starting dataproc cluster. Dataproc version: preview-debian11
+ gcloud dataproc clusters create beam-loadtests-go-combine-flink-batch-737 --region=us-central1 --num-workers=5 --master-machine-type=n1-standard-2 --worker-machine-type=n1-standard-2 --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.15.0/flink-1.15.0-bin-scala_2.12.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest, --image-version=preview-debian11 --zone=us-central1-a --optional-components=FLINK,DOCKER --quiet
Waiting on operation [projects/apache-beam-testing/regions/us-central1/operations/cf5406ff-6a24-381d-96c6-6199e0de6783].
Waiting for cluster creation operation...
WARNING: Consider using Auto Zone rather than selecting a zone manually. See https://cloud.google.com/dataproc/docs/concepts/configuring-clusters/auto-zone
.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/us-central1/clusters/beam-loadtests-go-combine-flink-batch-737] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m '--command=yarn application -list'
++ grep 'Apache Flink'
Writing 3 keys to /home/jenkins/.ssh/google_compute_known_hosts
2022-12-09 08:44:20,200 INFO client.DefaultNoHARMFailoverProxyProvider: Connecting to ResourceManager at beam-loadtests-go-combine-flink-batch-737-m.c.apache-beam-testing.internal./10.128.0.195:8032
2022-12-09 08:44:20,475 INFO client.AHSProxy: Connecting to Application History server at beam-loadtests-go-combine-flink-batch-737-m.c.apache-beam-testing.internal./10.128.0.195:10200
+ read line
+ echo application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
++ echo application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
++ sed 's/ .*//'
+ application_ids[$i]=application_1670575344313_0002
++ echo application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803
++ sed -E 's#.*(https?://)##'
++ sed 's/ .*//'
+ application_masters[$i]=10.128.0.193:32803
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=10.128.0.193:32803
+ echo 'Using Yarn Application master: 10.128.0.193:32803'
Using Yarn Application master: 10.128.0.193:32803
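The `get_leader` trace above splits each `yarn application -list` row into an application id (the first whitespace-delimited field) and the application master host:port (the tail of the tracking URL). A standalone sketch of those two sed extractions, on the sample row from the trace:

```shell
# Parse one row of `yarn application -list` output, as get_leader does above.
# The sample row is copied verbatim from the trace.
line='application_1670575344313_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://10.128.0.193:32803'
app_id=$(echo "$line" | sed 's/ .*//')                          # keep first field
master=$(echo "$line" | sed -E 's#.*(https?://)##' | sed 's/ .*//')  # keep URL tail
echo "$app_id"   # application_1670575344313_0002
echo "$master"   # 10.128.0.193:32803
```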
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest --flink-master=10.128.0.193:32803 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-737'
Existing host keys found in /home/jenkins/.ssh/google_compute_known_hosts
Unable to find image 'gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest' locally
latest: Pulling from apache-beam-testing/beam_portability/beam_flink1.15_job_server
001c52e26ad5: Pulling fs layer
d9d4b9b6e964: Pulling fs layer
2068746827ec: Pulling fs layer
9daef329d350: Pulling fs layer
d85151f15b66: Pulling fs layer
52a8c426d30b: Pulling fs layer
8754a66e0050: Pulling fs layer
f84eb606444d: Pulling fs layer
0f3b111e627c: Pulling fs layer
6f880a280a05: Pulling fs layer
87c4199424f5: Pulling fs layer
4bfecfd5da75: Pulling fs layer
9daef329d350: Waiting
d85151f15b66: Waiting
52a8c426d30b: Waiting
8754a66e0050: Waiting
f84eb606444d: Waiting
0f3b111e627c: Waiting
6f880a280a05: Waiting
87c4199424f5: Waiting
4bfecfd5da75: Waiting
d9d4b9b6e964: Download complete
2068746827ec: Verifying Checksum
2068746827ec: Download complete
d85151f15b66: Verifying Checksum
d85151f15b66: Download complete
52a8c426d30b: Verifying Checksum
52a8c426d30b: Download complete
001c52e26ad5: Verifying Checksum
001c52e26ad5: Download complete
9daef329d350: Download complete
f84eb606444d: Verifying Checksum
f84eb606444d: Download complete
6f880a280a05: Verifying Checksum
6f880a280a05: Download complete
87c4199424f5: Verifying Checksum
87c4199424f5: Download complete
4bfecfd5da75: Verifying Checksum
4bfecfd5da75: Download complete
8754a66e0050: Verifying Checksum
8754a66e0050: Download complete
0f3b111e627c: Verifying Checksum
0f3b111e627c: Download complete
001c52e26ad5: Pull complete
d9d4b9b6e964: Pull complete
2068746827ec: Pull complete
9daef329d350: Pull complete
d85151f15b66: Pull complete
52a8c426d30b: Pull complete
8754a66e0050: Pull complete
f84eb606444d: Pull complete
0f3b111e627c: Pull complete
6f880a280a05: Pull complete
87c4199424f5: Pull complete
4bfecfd5da75: Pull complete
Digest: sha256:e00cc03108c819670154f58e0003f936edd94b0e359dd0891c812504a6b33b2c
Status: Downloaded newer image for gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest
6e9f402ee4646f465178ee760ba506ee26c8a0f5bcf1e09f1ff6088eccb32886
+ start_tunnel
++ gcloud compute ssh --quiet --zone=us-central1-a yarn@beam-loadtests-go-combine-flink-batch-737-m '--command=curl -s "http://10.128.0.193:32803/jobmanager/config"'
Existing host keys found in /home/jenkins/.ssh/google_compute_known_hosts
+ local 'job_server_config=[{"key":"blob.server.port","value":"41871"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist-1.15.0.jar"},{"key":"classloader.check-leaked-classloader","value":"False"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1670575344313_0002"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.memory.jvm-overhead.min","value":"611948962b"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/root/appcache/application_1670575344313_0002"},{"key":"taskmanager.network.numberOfBuffers","value":"4096"},{"key":"parallelism.default","value":"8"},{"key":"taskmanager.numberOfTaskSlots","value":"2"},{"key":"env.hadoop.conf.dir","value":"/etc/hadoop/conf"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"5836"},{"key":"taskmanager.memory.process.size","value":"5836 mb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"5836"},{"key":"jobmanager.memory.off-heap.size","value":"134217728b"},{"key":"execution.target","value":"yarn-session"},{"key":"jobmanager.memory.process.size","value":"5836 
mb"},{"key":"web.tmpdir","value":"/tmp/flink-web-712a957a-e616-4cc7-9558-3667975ac359"},{"key":"jobmanager.rpc.port","value":"43491"},{"key":"rest.bind-address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"rest.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"jobmanager.memory.jvm-metaspace.size","value":"268435456b"},{"key":"$internal.deployment.config-dir","value":"/usr/lib/flink/conf"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.memory.heap.size","value":"5104887390b"},{"key":"jobmanager.memory.jvm-overhead.max","value":"611948962b"}]'
+ local key=jobmanager.rpc.port
++ echo 10.128.0.193:32803
++ cut -d : -f1
+ local yarn_application_master_host=10.128.0.193
++ echo '[{"key":"blob.server.port","value":"41871"},{"key":"yarn.flink-dist-jar","value":"file:/usr/lib/flink/lib/flink-dist-1.15.0.jar"},{"key":"classloader.check-leaked-classloader","value":"False"},{"key":"jobmanager.execution.failover-strategy","value":"region"},{"key":"high-availability.cluster-id","value":"application_1670575344313_0002"},{"key":"jobmanager.rpc.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"jobmanager.memory.jvm-overhead.min","value":"611948962b"},{"key":"io.tmp.dirs","value":"/hadoop/yarn/nm-local-dir/usercache/root/appcache/application_1670575344313_0002"},{"key":"taskmanager.network.numberOfBuffers","value":"4096"},{"key":"parallelism.default","value":"8"},{"key":"taskmanager.numberOfTaskSlots","value":"2"},{"key":"env.hadoop.conf.dir","value":"/etc/hadoop/conf"},{"key":"yarn.application.name","value":"flink-dataproc"},{"key":"taskmanager.heap.mb","value":"5836"},{"key":"taskmanager.memory.process.size","value":"5836' 'mb"},{"key":"web.port","value":"0"},{"key":"jobmanager.heap.mb","value":"5836"},{"key":"jobmanager.memory.off-heap.size","value":"134217728b"},{"key":"execution.target","value":"yarn-session"},{"key":"jobmanager.memory.process.size","value":"5836' 
'mb"},{"key":"web.tmpdir","value":"/tmp/flink-web-712a957a-e616-4cc7-9558-3667975ac359"},{"key":"jobmanager.rpc.port","value":"43491"},{"key":"rest.bind-address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"internal.io.tmpdirs.use-local-default","value":"true"},{"key":"execution.attached","value":"false"},{"key":"internal.cluster.execution-mode","value":"NORMAL"},{"key":"rest.address","value":"beam-loadtests-go-combine-flink-batch-737-w-4.c.apache-beam-testing.internal"},{"key":"fs.hdfs.hadoopconf","value":"/etc/hadoop/conf"},{"key":"jobmanager.memory.jvm-metaspace.size","value":"268435456b"},{"key":"$internal.deployment.config-dir","value":"/usr/lib/flink/conf"},{"key":"$internal.yarn.log-config-file","value":"/usr/lib/flink/conf/log4j.properties"},{"key":"jobmanager.memory.heap.size","value":"5104887390b"},{"key":"jobmanager.memory.jvm-overhead.max","value":"611948962b"}]'
++ python -c 'import sys, json; print([e['\''value'\''] for e in json.load(sys.stdin) if e['\''key'\''] == u'\''jobmanager.rpc.port'\''][0])'
+ local jobmanager_rpc_port=43491
++ [[ true == \t\r\u\e ]]
++ echo ' -Nf >& /dev/null'
+ local 'detached_mode_params= -Nf >& /dev/null'
++ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.15_job_server:latest ]]
++ echo '-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'job_server_ports_forwarding=-L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097'
+ local 'tunnel_command=gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m -- -L 8081:10.128.0.193:32803 -L 43491:10.128.0.193:43491 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf >& /dev/null'
+ eval gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m -- -L 8081:10.128.0.193:32803 -L 43491:10.128.0.193:43491 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf '>&' /dev/null
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-737-m -- -L 8081:10.128.0.193:32803 -L 43491:10.128.0.193:43491 -L 8099:localhost:8099 -L 8098:localhost:8098 -L 8097:localhost:8097 -D 1080 -Nf
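The `jobmanager.rpc.port` lookup a few lines up pipes Flink's `/jobmanager/config` JSON (a list of key/value objects) through a Python one-liner. A trimmed standalone sketch of the same extraction; the sample JSON is a two-entry excerpt, and `python3` is assumed here where the trace invokes `python`:

```shell
# Extract a value from Flink's /jobmanager/config response, which is a JSON
# list of {"key": ..., "value": ...} objects. Sample trimmed from the trace.
config='[{"key":"blob.server.port","value":"41871"},{"key":"jobmanager.rpc.port","value":"43491"}]'
echo "$config" | python3 -c 'import sys, json; print([e["value"] for e in json.load(sys.stdin) if e["key"] == "jobmanager.rpc.port"][0])'
# prints: 43491
```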
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins2278042159123416236.sh
+ echo '*** Combine Go Load test: 2GB of 10B records ***'
*** Combine Go Load test: 2GB of 10B records ***
[Gradle] - Launching build.
[src] $ <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/gradlew> -PloadTest.mainClass=combine -Prunner=FlinkRunner '-PloadTest.args=--job_name=load-tests-go-flink-batch-combine-1-1209065329 --influx_namespace=flink --influx_measurement=go_batch_combine_1 --input_options='{"num_records": 200000000,"key_size": 1,"value_size": 9}' --fanout=1 --top_count=20 --parallelism=5 --endpoint=localhost:8099 --environment_type=DOCKER --environment_config=gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest --influx_db_name=beam_test_metrics --influx_hostname=http://10.128.0.96:8086 --runner=FlinkRunner' --continue --max-workers=8 -Dorg.gradle.jvmargs=-Xms2g -Dorg.gradle.jvmargs=-Xmx6g -Dorg.gradle.vfs.watch=false -Pdocker-pull-licenses :sdks:go:test:load:run
Starting a Gradle Daemon, 2 busy Daemons could not be reused, use --status for details
Configuration on demand is an incubating feature.
> Task :buildSrc:compileJava NO-SOURCE
> Task :buildSrc:compileGroovy FROM-CACHE
> Task :buildSrc:pluginDescriptors
> Task :buildSrc:processResources
> Task :buildSrc:classes
> Task :buildSrc:jar
> Task :buildSrc:assemble
> Task :buildSrc:spotlessGroovy FROM-CACHE
> Task :buildSrc:spotlessGroovyCheck UP-TO-DATE
> Task :buildSrc:spotlessGroovyGradle FROM-CACHE
> Task :buildSrc:spotlessGroovyGradleCheck UP-TO-DATE
> Task :buildSrc:spotlessCheck UP-TO-DATE
> Task :buildSrc:pluginUnderTestMetadata
> Task :buildSrc:compileTestJava NO-SOURCE
> Task :buildSrc:compileTestGroovy NO-SOURCE
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validatePlugins FROM-CACHE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build
System Go installation: /snap/bin/go is go version go1.16.15 linux/amd64; Preparing to use /home/jenkins/go/bin/go1.19.3
go install golang.org/dl/go1.19.3@latest: no matching versions for query "latest"
FAILURE: Build failed with an exception.
* What went wrong:
Could not determine the dependencies of task ':sdks:go:test:load:goBuild'.
> Could not create task ':sdks:go:test:load:goPrepare'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 2m 11s
10 actionable tasks: 4 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4dlrc2bl2s5ww
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #736
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/736/display/redirect?page=changes>
Changes:
[noreply] pubsub: fix typo in grpc client factory
[Kenneth Knowles] Suppress keyfor warnings
[Kenneth Knowles] Suppress checker warnings that are confusing and difficult
[Kenneth Knowles] Add @Pure annotations to MongoDbIO autovalue fields
[Kenneth Knowles] Suppress checker in FnApiDoFnRunner due to crash
[Kenneth Knowles] Suppress checker framework in Dataflow
[Kenneth Knowles] Fix some nullness errors in Spark runner
[Kenneth Knowles] Upgrade checker framework to 3.27.0
[noreply] Migrate testing subpackages from interface{} to any (#24570)
[noreply] fix go lints (#24566)
[noreply] Samza runner support for non unique stateId across multiple ParDos
[noreply] Bump to Hadoop 3.3.4 for performance tests (#24550)
[noreply] regenerate python dependencies (#24582)
[noreply] Return empty splits if unable to split, not errors (#24508)
------------------------------------------
[...truncated 34.08 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/08 08:46:08 Using specified worker binary: 'linux_amd64/combine'
2022/12/08 08:46:09 Prepared job with id: load-tests-go-flink-batch-combine-1-1208065321_a7d43c57-3d94-4f29-a4aa-e70292dc8c21 and staging token: load-tests-go-flink-batch-combine-1-1208065321_a7d43c57-3d94-4f29-a4aa-e70292dc8c21
2022/12/08 08:46:15 Staged binary artifact with token:
2022/12/08 08:46:16 Submitted job: load0tests0go0flink0batch0combine0101208065321-root-1208084615-660de1a2_f3683de6-60c5-49a2-b2fd-ee4100556fb3
2022/12/08 08:46:16 Job state: STOPPED
2022/12/08 08:46:16 Job state: STARTING
2022/12/08 08:46:16 Job state: RUNNING
2022/12/08 09:40:28 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 36fb95ea3a9c41ae59107f1979ac4e73)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670488965772_0001_01_000002(beam-loadtests-go-combine-flink-batch-736-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/08 09:40:28 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670488965772_0001_01_000002(beam-loadtests-go-combine-flink-batch-736-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/12/08 09:40:29 Job state: FAILED
2022/12/08 09:40:29 Failed to execute job: job load0tests0go0flink0batch0combine0101208065321-root-1208084615-660de1a2_f3683de6-60c5-49a2-b2fd-ee4100556fb3 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101208065321-root-1208084615-660de1a2_f3683de6-60c5-49a2-b2fd-ee4100556fb3 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1683708, 0xc000128000}, {0x14e49c1?, 0x20447f8?}, {0xc0000f9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 14s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vgwsxz5osia2k
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #735
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/735/display/redirect?page=changes>
Changes:
[bulatkazan] [Website] center the main content #24521
[Moritz Mack] [Spark Dataset runner] Broadcast pipeline options
[bulatkazan] [Website] update copy icon positioning #24426
[noreply] Update google-cloud-bigquery-storage requirement from <2.14,>=2.6.3 to
[noreply] Bump cloud.google.com/go/pubsub from 1.27.1 to 1.28.0 in /sdks (#24534)
[noreply] Bump golang.org/x/net from 0.2.0 to 0.3.0 in /sdks (#24544)
[noreply] Precommit python version update (#24526)
[noreply] [CdapIO] Add CdapIO and SparkReceiverIO documentation in website
[relax] fix null pointer exception caused by clearing member variable
[noreply] Adding support for Pubsub Lite Writes in SchemaTransforms (#24359)
[noreply] Disallow using the JRH with Python streaming pipelines (#24513)
[noreply] Add RunInference example for TensorFlow Hub pre-trained model (#24529)
[noreply] update(PULL Request template) remove Choose reviewer (#24540)
[noreply] Revert "Bump actions/setup-java from 3.6.0 to 3.7.0 (#24484)" (#24551)
[noreply] Interface{}->any for more subfolders (#24553)
[Kenneth Knowles] Support multiple gradle tasks in one precommit job
[Kenneth Knowles] Split up some IOs from Java PreCommit
------------------------------------------
[...truncated 34.17 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/07 08:45:47 Using specified **** binary: 'linux_amd64/combine'
2022/12/07 08:45:47 Prepared job with id: load-tests-go-flink-batch-combine-1-1207065321_4c863571-98c8-4055-a51b-032fadc75e11 and staging token: load-tests-go-flink-batch-combine-1-1207065321_4c863571-98c8-4055-a51b-032fadc75e11
2022/12/07 08:45:53 Staged binary artifact with token:
2022/12/07 08:45:55 Submitted job: load0tests0go0flink0batch0combine0101207065321-root-1207084553-f1233ef_f94ce1a6-ad25-47a5-8479-69eb16d2af84
2022/12/07 08:45:55 Job state: STOPPED
2022/12/07 08:45:55 Job state: STARTING
2022/12/07 08:45:55 Job state: RUNNING
2022/12/07 09:39:49 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: d783cb55aa1ff17682c335cc073d7a06)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670402552419_0002_01_000002(beam-loadtests-go-combine-flink-batch-735-w-0.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/07 09:39:49 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670402552419_0002_01_000002(beam-loadtests-go-combine-flink-batch-735-w-0.c.apache-beam-testing.internal:8026) timed out.
2022/12/07 09:39:49 Job state: FAILED
2022/12/07 09:39:49 Failed to execute job: job load0tests0go0flink0batch0combine0101207065321-root-1207084553-f1233ef_f94ce1a6-ad25-47a5-8479-69eb16d2af84 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101207065321-root-1207084553-f1233ef_f94ce1a6-ad25-47a5-8479-69eb16d2af84 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1683728, 0xc000128000}, {0x14e498b?, 0x20447d8?}, {0xc000299e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/yy5lq4xd4tvcu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #734
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/734/display/redirect?page=changes>
Changes:
[noreply] Update nbconvert requirement in /sdks/python
[noreply] [CdapIO] Add readme for CdapIO. Update readme for SparkReceiverIO.
[Moritz Mack] [Spark Dataset runner] Add @Experimental and reduce visibility where
[noreply] Fix grafana dashboard id (#24524)
[noreply] [Spark runner] Support running (VR) tests with Java 17 (closes #24400)
[noreply] Replaced deprecated finalize with DoFn Teardown (#24516)
[noreply] Bump cloud.google.com/go/storage from 1.28.0 to 1.28.1 in /sdks (#24517)
[noreply] add clarifier to error message (#24449)
[noreply] Batch rename requests in fileio.WriteToFiles (#24341)
[noreply] Bump golang.org/x/text from 0.4.0 to 0.5.0 in /sdks (#24520)
[noreply] Support for JsonSchema in Kafka Read Schema Transform (#24272)
[noreply] Run go fmt over full go directory with go 1.19 (#24525)
[noreply] Cloudbuild+manualsetup+playground (#24144)
[noreply] Bump golang.org/x/sys from 0.2.0 to 0.3.0 in /sdks (#24519)
[noreply] Bump cloud.google.com/go/bigtable from 1.18.0 to 1.18.1 in /sdks
[noreply] Update from interface{} -> any for core packages (#24505)
[noreply] Implement FileWriteSchemaTransformConfiguration (#24479)
[noreply] Bump cloud.google.com/go/pubsub from 1.27.0 to 1.27.1 in /sdks (#24518)
[noreply] [Playground] Healthcheck was added (#24227)
[noreply] Update dataflow container version for Pandas upgrade (#24532)
------------------------------------------
[...truncated 34.10 KB...]
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/06 08:45:26 Using specified **** binary: 'linux_amd64/combine'
2022/12/06 08:45:26 Prepared job with id: load-tests-go-flink-batch-combine-1-1206065319_d25e72ba-3688-480f-ac00-1002bf1e05f7 and staging token: load-tests-go-flink-batch-combine-1-1206065319_d25e72ba-3688-480f-ac00-1002bf1e05f7
2022/12/06 08:45:32 Staged binary artifact with token:
2022/12/06 08:45:33 Submitted job: load0tests0go0flink0batch0combine0101206065319-root-1206084532-7bd64a21_650a6f87-6b5f-4552-9f0f-b8448656f04d
2022/12/06 08:45:33 Job state: STOPPED
2022/12/06 08:45:33 Job state: STARTING
2022/12/06 08:45:33 Job state: RUNNING
2022/12/06 09:40:06 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c826aa108fd6afe5f1224cc89b27637d)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyTargetUnreachable(JobMaster.java:1387)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.reportHeartbeatRpcFailure(HeartbeatMonitorImpl.java:123)
at org.apache.flink.runtime.heartbeat.HeartbeatManagerImpl.runIfHeartbeatMonitorExists(HeartbeatManagerImpl.java:275)
at org.apache.flink.runtime.heartbeat.HeartbeatManagerImpl.reportHeartbeatTargetUnreachable(HeartbeatManagerImpl.java:267)
at org.apache.flink.runtime.heartbeat.HeartbeatManagerImpl.handleHeartbeatRpcFailure(HeartbeatManagerImpl.java:262)
at org.apache.flink.runtime.heartbeat.HeartbeatManagerImpl.lambda$handleHeartbeatRpc$0(HeartbeatManagerImpl.java:248)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:478)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: org.apache.flink.runtime.jobmaster.JobMasterException: TaskManager with id container_1670316138886_0001_01_000003(beam-loadtests-go-combine-flink-batch-734-w-3.c.apache-beam-testing.internal:8026) is no longer reachable.
... 36 more
2022/12/06 09:40:06 (): org.apache.flink.runtime.jobmaster.JobMasterException: TaskManager with id container_1670316138886_0001_01_000003(beam-loadtests-go-combine-flink-batch-734-w-3.c.apache-beam-testing.internal:8026) is no longer reachable.
2022/12/06 09:40:06 Job state: FAILED
2022/12/06 09:40:06 Failed to execute job: job load0tests0go0flink0batch0combine0101206065319-root-1206084532-7bd64a21_650a6f87-6b5f-4552-9f0f-b8448656f04d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101206065319-root-1206084532-7bd64a21_650a6f87-6b5f-4552-9f0f-b8448656f04d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16834e8, 0xc000128000}, {0x14e47ab?, 0x20447d8?}, {0xc0006d1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 20s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/zshjrswyzwe2a
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #733
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/733/display/redirect?page=changes>
Changes:
[noreply] Bump github.com/aws/aws-sdk-go-v2/feature/s3/manager in /sdks (#24501)
------------------------------------------
[...truncated 34.13 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/05 08:45:31 Using specified **** binary: 'linux_amd64/combine'
2022/12/05 08:45:32 Prepared job with id: load-tests-go-flink-batch-combine-1-1205065320_fcc6c736-f415-47a0-8ef2-306244d61255 and staging token: load-tests-go-flink-batch-combine-1-1205065320_fcc6c736-f415-47a0-8ef2-306244d61255
2022/12/05 08:45:42 Staged binary artifact with token:
2022/12/05 08:45:43 Submitted job: load0tests0go0flink0batch0combine0101205065320-root-1205084542-8f9b9ea3_f4db172f-e0ce-475d-abf3-520ed066fa15
2022/12/05 08:45:43 Job state: STOPPED
2022/12/05 08:45:43 Job state: STARTING
2022/12/05 08:45:43 Job state: RUNNING
2022/12/05 09:39:45 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 618a48b32f7bbe91a91a4b6b51297dcc)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670229752532_0002_01_000002(beam-loadtests-go-combine-flink-batch-733-w-3.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/05 09:39:45 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670229752532_0002_01_000002(beam-loadtests-go-combine-flink-batch-733-w-3.c.apache-beam-testing.internal:8026) timed out.
2022/12/05 09:39:45 Job state: FAILED
2022/12/05 09:39:45 Failed to execute job: job load0tests0go0flink0batch0combine0101205065320-root-1205084542-8f9b9ea3_f4db172f-e0ce-475d-abf3-520ed066fa15 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101205065320-root-1205084542-8f9b9ea3_f4db172f-e0ce-475d-abf3-520ed066fa15 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1683868, 0xc00004e0c0}, {0x14e4b2b?, 0x2044978?}, {0xc0003bbe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/n6bfsrrgxhf6i
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #732
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/732/display/redirect?page=changes>
Changes:
[noreply] Bump github.com/aws/aws-sdk-go-v2/service/s3 in /sdks (#24502)
[noreply] Adding support for PubSub Lite in Schema Transforms (#24275)
[noreply] Fix mispositioned producer_type_hints and producer_batch_converter
[noreply] Update REVIEWERS.yml (#24507)
------------------------------------------
[...truncated 34.01 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/04 08:45:24 Using specified **** binary: 'linux_amd64/combine'
2022/12/04 08:45:25 Prepared job with id: load-tests-go-flink-batch-combine-1-1204065328_fca5766d-df9b-4c54-9f18-3256f0df2f73 and staging token: load-tests-go-flink-batch-combine-1-1204065328_fca5766d-df9b-4c54-9f18-3256f0df2f73
2022/12/04 08:45:34 Staged binary artifact with token:
2022/12/04 08:45:36 Submitted job: load0tests0go0flink0batch0combine0101204065328-root-1204084534-743ebb2f_15f700dc-b05a-4dd7-9491-cae039fc2c0d
2022/12/04 08:45:37 Job state: STOPPED
2022/12/04 08:45:37 Job state: STARTING
2022/12/04 08:45:37 Job state: RUNNING
2022/12/04 09:39:52 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 5befb113ab0b87b462a88649bd291f3b)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670143348807_0002_01_000004(beam-loadtests-go-combine-flink-batch-732-w-3.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/04 09:39:52 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670143348807_0002_01_000004(beam-loadtests-go-combine-flink-batch-732-w-3.c.apache-beam-testing.internal:8026) timed out.
2022/12/04 09:39:52 Job state: FAILED
2022/12/04 09:39:52 Failed to execute job: job load0tests0go0flink0batch0combine0101204065328-root-1204084534-743ebb2f_15f700dc-b05a-4dd7-9491-cae039fc2c0d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101204065328-root-1204084534-743ebb2f_15f700dc-b05a-4dd7-9491-cae039fc2c0d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1683868, 0xc00004e0c0}, {0x14e4b2b?, 0x2044978?}, {0xc000677e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 52s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/l7k3jbsntnwma
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #731
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/731/display/redirect?page=changes>
Changes:
[Xinyu Liu] Disable flaky async tests in samza runner
[noreply] Bump cloud.google.com/go/spanner from 1.40.0 to 1.41.0 in /sdks (#24483)
[noreply] Revert "Support SqlTypes Date and Timestamp (MicrosInstant) in AvroUtils
[noreply] Bump actions/setup-java from 3.6.0 to 3.7.0 (#24484)
[noreply] Add GPU benchmark to RunInference Grafana dashboard (#24485)
[Yi Hu] Add back javadoc
[noreply] [CdapIO] Update read timeout for SparkReceiverIOIT job (#24428)
[noreply] Bump cloud.google.com/go/datastore from 1.9.0 to 1.10.0 in /sdks
[noreply] Fix broken typescript tests
[noreply] Reduce version matrix of build wheel on pull request event (#24448)
[noreply] [Spark dataset runner] Make sure PCollection views get only broadcasted
[noreply] Fix for #22951 w/ GroupIntoBatchesOverride fix (#24463)
[noreply] Never mark issues as stale (#24494)
[noreply] [BigQueryIO] Use final destination table schema and metadata when
[noreply] Minor visibility improvements to render runner. (#24480)
[noreply] [Go SDK]: Infer field names from struct tags (#24473)
[noreply] Bump github.com/go-sql-driver/mysql from 1.6.0 to 1.7.0 in /sdks
[noreply] [Website] add coauthor option to case-studies #24486 (#24475)
[noreply] Add panel pointing to docs for ml benchmarks (#24503)
[noreply] Adding Beam Schemas capability to parse json-schemas. This is the de-…
[noreply] Bump tensorflow from 2.9.1 to 2.9.3 in
[noreply] [SQL extension] Minor fixes to logging applying best practices (#24328)
[noreply] Apply task configuration avoidance (#24509)
[noreply] Bump cloud.google.com/go/profiler from 0.3.0 to 0.3.1 in /sdks (#24498)
[relax] fix exception
------------------------------------------
[...truncated 34.14 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/03 08:45:48 Using specified worker binary: 'linux_amd64/combine'
2022/12/03 08:45:49 Prepared job with id: load-tests-go-flink-batch-combine-1-1203070842_c2557cb0-71f8-4700-9307-522e80a7d860 and staging token: load-tests-go-flink-batch-combine-1-1203070842_c2557cb0-71f8-4700-9307-522e80a7d860
2022/12/03 08:45:54 Staged binary artifact with token:
2022/12/03 08:45:56 Submitted job: load0tests0go0flink0batch0combine0101203070842-root-1203084555-f21a3253_661de8d4-ff54-4cbb-b261-9fd78536601c
2022/12/03 08:45:56 Job state: STOPPED
2022/12/03 08:45:56 Job state: STARTING
2022/12/03 08:45:56 Job state: RUNNING
2022/12/03 09:40:46 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e4b8db90f3600f571672c25ae870a666)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670056955677_0002_01_000004(beam-loadtests-go-combine-flink-batch-731-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/03 09:40:46 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1670056955677_0002_01_000004(beam-loadtests-go-combine-flink-batch-731-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/12/03 09:40:46 Job state: FAILED
2022/12/03 09:40:46 Failed to execute job: job load0tests0go0flink0batch0combine0101203070842-root-1203084555-f21a3253_661de8d4-ff54-4cbb-b261-9fd78536601c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101203070842-root-1203084555-f21a3253_661de8d4-ff54-4cbb-b261-9fd78536601c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1683868, 0xc0001a2000}, {0x14e4b2b?, 0x2044978?}, {0xc000457e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 29s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hlhs7yyana7eu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #730
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/730/display/redirect?page=changes>
Changes:
[noreply] Bump cloud.google.com/go/bigquery from 1.43.0 to 1.44.0 in /sdks
[noreply] ML notebook formatting and text updates (#24437)
[noreply] lint fixes (#24455)
[noreply] Bump cloud.google.com/go/pubsub from 1.26.0 to 1.27.0 in /sdks (#24450)
[noreply] Install venv dependencies in local env setup (#24461)
[noreply] Don't set BigQuery services in schema transform configuration (#24316)
[noreply] Apache playground blog (#24431)
[noreply] Sort SchemaTransform configuration schema fields by name to establish
[noreply] Rename from default-pool to pool-1 (#24466)
[Kenneth Knowles] Moving to 2.45.0-SNAPSHOT on master branch.
[noreply] [Website] change svg to png (#24268)
[noreply] Fix error messages in cred rotation email (#24474)
[noreply] Add integer to NUMERIC and BIGNUMERIC conversion support (#24447)
[noreply] Update jackson dep. (#24445)
[noreply] Reduce calls to FileSystem.match and API calls in FileSystem._list
[noreply] Capture full response context to provide complete error information
[noreply] Pandas 1.5 support (#23973)
------------------------------------------
[...truncated 34.17 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/02 08:46:04 Using specified worker binary: 'linux_amd64/combine'
2022/12/02 08:46:04 Prepared job with id: load-tests-go-flink-batch-combine-1-1202065330_c6ca08dc-3f03-446e-a8c4-91f51299de9e and staging token: load-tests-go-flink-batch-combine-1-1202065330_c6ca08dc-3f03-446e-a8c4-91f51299de9e
2022/12/02 08:46:10 Staged binary artifact with token:
2022/12/02 08:46:11 Submitted job: load0tests0go0flink0batch0combine0101202065330-root-1202084610-961f79a1_1292d958-847d-46e1-83a1-2c46aaf0a294
2022/12/02 08:46:11 Job state: STOPPED
2022/12/02 08:46:11 Job state: STARTING
2022/12/02 08:46:11 Job state: RUNNING
2022/12/02 09:41:04 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: d9b426e5e74caece745bc5e7d8bba114)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669970565331_0001_01_000005(beam-loadtests-go-combine-flink-batch-730-w-4.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/02 09:41:04 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669970565331_0001_01_000005(beam-loadtests-go-combine-flink-batch-730-w-4.c.apache-beam-testing.internal:8026) timed out.
2022/12/02 09:41:04 Job state: FAILED
2022/12/02 09:41:04 Failed to execute job: job load0tests0go0flink0batch0combine0101202065330-root-1202084610-961f79a1_1292d958-847d-46e1-83a1-2c46aaf0a294 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101202065330-root-1202084610-961f79a1_1292d958-847d-46e1-83a1-2c46aaf0a294 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1683868, 0xc00004e0c0}, {0x14e4bb5?, 0x2044978?}, {0xc000025e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 33s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/rdigkity4opl4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #729
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/729/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] update copy-to-clipboard.js #24372
[noreply] Bump google.golang.org/grpc from 1.50.1 to 1.51.0 in /sdks (#24281)
[noreply] [Playground] use JAVA SDK 2.43.0 in Examples CI (#24429)
[noreply] Update authors.yml (#24433)
[noreply] Bump cloud.google.com/go/spanner from 1.36.0 to 1.40.0 in /sdks (#24423)
[noreply] Add Large Language Model RunInference Example (#24350)
[noreply] [Playground] [Backend] minor fixes for error msgs (#23999)
[noreply] pg_24284_now_closing_parenthesis on cancel button is visible (#24327)
[noreply] [Github Actions] - Cut Release Branch Workflow (#24020)
[noreply] Add six to build-requirements.txt (#24434)
[noreply] Add Pytorch RunInference GPU benchmark (#24347)
[noreply] Fix multiple mutations affecting the same entity in Datastore write
[noreply] Fix BlobstorageIO.checksum Attribute Error (#24442)
[noreply] Bump github.com/tetratelabs/wazero in /sdks (#24453)
[noreply] [BEAM-12164] Support querying against Postgres for the SpannerIO change
------------------------------------------
[...truncated 34.24 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/01 08:45:57 Using specified worker binary: 'linux_amd64/combine'
2022/12/01 08:45:58 Prepared job with id: load-tests-go-flink-batch-combine-1-1201065337_c2412b95-8529-420a-b8a5-6eb86d7c1032 and staging token: load-tests-go-flink-batch-combine-1-1201065337_c2412b95-8529-420a-b8a5-6eb86d7c1032
2022/12/01 08:46:03 Staged binary artifact with token:
2022/12/01 08:46:05 Submitted job: load0tests0go0flink0batch0combine0101201065337-root-1201084604-352596c_c910000d-0b38-4b9f-a256-cd90e2da0a33
2022/12/01 08:46:05 Job state: STOPPED
2022/12/01 08:46:05 Job state: STARTING
2022/12/01 08:46:05 Job state: RUNNING
2022/12/01 09:40:26 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: b6f3083a537990e0a149d42a06795b5b)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669884156025_0001_01_000002(beam-loadtests-go-combine-flink-batch-729-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/12/01 09:40:26 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669884156025_0001_01_000002(beam-loadtests-go-combine-flink-batch-729-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/12/01 09:40:26 Job state: FAILED
2022/12/01 09:40:26 Failed to execute job: job load0tests0go0flink0batch0combine0101201065337-root-1201084604-352596c_c910000d-0b38-4b9f-a256-cd90e2da0a33 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101201065337-root-1201084604-352596c_c910000d-0b38-4b9f-a256-cd90e2da0a33 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x167b7a8, 0xc000130000}, {0x14dd405?, 0x20349e8?}, {0xc000169e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 3s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uv34kk5oofo5a
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #728
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/728/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Fix dependencies of archetype tasks
[Kenneth Knowles] Upgrade checker framework to 3.13.0
[Kenneth Knowles] Upgrade checker framework to 3.14.0
[Kenneth Knowles] Upgrade checker framework to 3.15.0
[Kenneth Knowles] Inline :sdks:java:core:buildDependents so we can incrementally split
[Moritz Mack] [Spark Dataset runner] Fix support for Java 11 (closes #24392)
[Moritz Mack] fix spotless
[noreply] Fix SparkReceiverIOIT test (#24375)
[noreply] Bump cloud.google.com/go/bigquery from 1.42.0 to 1.43.0 in /sdks
[noreply] Bump github.com/aws/aws-sdk-go-v2/feature/s3/manager in /sdks (#24348)
[noreply] pg_23079 remove replacing tabs at playground (#24285)
[noreply] [#24339] Make Slices use iterable coder instead of custom coder.
[noreply] Add custom inference fns to CHANGES.md (#24412)
[noreply] Better warning and Exception message in CalciteUtil (#24414)
[noreply] List breaking change #24339 in Changes.md (#24420)
[noreply] Allow composite output types in sql.Transform. (#24421)
[noreply] Add map_windows support to Go SDK (#24307)
[noreply] Deleted initialNumReaders parameter. (#24355)
------------------------------------------
[...truncated 33.96 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c10"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c6"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c7"
component_coder_ids: "c8"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/30 08:47:22 Using specified worker binary: 'linux_amd64/combine'
2022/11/30 08:47:23 Prepared job with id: load-tests-go-flink-batch-combine-1-1130065331_2db3b87f-1c89-4e92-b7d3-fd174225a674 and staging token: load-tests-go-flink-batch-combine-1-1130065331_2db3b87f-1c89-4e92-b7d3-fd174225a674
2022/11/30 08:47:29 Staged binary artifact with token:
2022/11/30 08:47:31 Submitted job: load0tests0go0flink0batch0combine0101130065331-root-1130084730-91271459_78cddc50-b04b-4a2a-9125-7649ce8ba587
2022/11/30 08:47:31 Job state: STOPPED
2022/11/30 08:47:31 Job state: STARTING
2022/11/30 08:47:31 Job state: RUNNING
2022/11/30 09:41:59 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 2dc481b01ba95591376812d7ecb30cec)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669797820037_0001_01_000004(beam-loadtests-go-combine-flink-batch-728-w-0.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/30 09:41:59 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669797820037_0001_01_000004(beam-loadtests-go-combine-flink-batch-728-w-0.c.apache-beam-testing.internal:8026) timed out.
2022/11/30 09:42:00 Job state: FAILED
2022/11/30 09:42:00 Failed to execute job: job load0tests0go0flink0batch0combine0101130065331-root-1130084730-91271459_78cddc50-b04b-4a2a-9125-7649ce8ba587 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101130065331-root-1130084730-91271459_78cddc50-b04b-4a2a-9125-7649ce8ba587 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1679a68, 0xc000128000}, {0x14db870?, 0x2031968?}, {0xc00029be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 34s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/b7wopx45x2tbq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #727
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/727/display/redirect?page=changes>
Changes:
[Andrew Pilloud] Handle CompleteWorkStatus shutdown signal
[Kenneth Knowles] Upgrade checkerframework gradle plugin to 0.6.19
[Kenneth Knowles] Check for null in BeamFnDataGrpcMultiplexer
[Kenneth Knowles] Upgrade checkerframework to 3.12.0
[Andrew Pilloud] Simplify sdks/java/harness build
[Andrew Pilloud] Move configuration changes before shadowJar
[noreply] [Tour Of Beam] persistence_key for Pg::SaveSnippet (#24287)
[noreply] Get postcommits green and unsickbay (#24342)
[noreply] Fix workflow cron syntax (#24376)
[noreply] concurrency (#24332)
[Andrew Pilloud] Exclude :sdks:java:core from harness jar
[Andrew Pilloud] Enable shadowJar validation for sdks/java/harness
[Andrew Pilloud] Add missing portability runner dependencies
[noreply] Revert "Force discarding mode in with_fanout without rewindowing."
[Andrew Pilloud] Exclude jamm from harness jar
[Andrew Pilloud] Enforce GCP BOM on sdks/java/harness
[noreply] Bump pillow from 9.2.0 to 9.3.0 in
[noreply] Update precombine benchmark to better represent varied workloads (#24343)
[noreply] Merge pull request #24320: update bom to the latest one
[noreply] Merge pull request #24147: First step in adding schema update to Storage
------------------------------------------
[...truncated 34.54 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/29 08:45:40 Using specified **** binary: 'linux_amd64/combine'
2022/11/29 08:45:41 Prepared job with id: load-tests-go-flink-batch-combine-1-1129065322_c3b855c6-4cf7-4c35-827e-9cc7bb91e73b and staging token: load-tests-go-flink-batch-combine-1-1129065322_c3b855c6-4cf7-4c35-827e-9cc7bb91e73b
2022/11/29 08:45:51 Staged binary artifact with token:
2022/11/29 08:45:53 Submitted job: load0tests0go0flink0batch0combine0101129065322-root-1129084551-e4ef67f3_2cbed2dc-67d9-43d4-8ed5-40381f26de35
2022/11/29 08:45:53 Job state: STOPPED
2022/11/29 08:45:53 Job state: STARTING
2022/11/29 08:45:53 Job state: RUNNING
2022/11/29 09:41:05 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ab195d7342bc47b2d739864b30ce7be1)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669711364140_0002_01_000003(beam-loadtests-go-combine-flink-batch-727-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/29 09:41:05 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669711364140_0002_01_000003(beam-loadtests-go-combine-flink-batch-727-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/11/29 09:41:05 Job state: FAILED
2022/11/29 09:41:05 Failed to execute job: job load0tests0go0flink0batch0combine0101129065322-root-1129084551-e4ef67f3_2cbed2dc-67d9-43d4-8ed5-40381f26de35 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101129065322-root-1129084551-e4ef67f3_2cbed2dc-67d9-43d4-8ed5-40381f26de35 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc00004e0c0}, {0x14da42d?, 0x202f968?}, {0xc000249e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 49s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/rsxwvm5llgrxs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #726
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/726/display/redirect>
Changes:
------------------------------------------
[...truncated 34.65 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/28 08:45:55 Using specified **** binary: 'linux_amd64/combine'
2022/11/28 08:45:55 Prepared job with id: load-tests-go-flink-batch-combine-1-1128065321_7e006697-e334-466b-abda-fde0d75c9d82 and staging token: load-tests-go-flink-batch-combine-1-1128065321_7e006697-e334-466b-abda-fde0d75c9d82
2022/11/28 08:46:02 Staged binary artifact with token:
2022/11/28 08:46:03 Submitted job: load0tests0go0flink0batch0combine0101128065321-root-1128084602-27db3aef_f15f9e12-2cc7-4fc7-b899-f2f47f946dcd
2022/11/28 08:46:03 Job state: STOPPED
2022/11/28 08:46:04 Job state: STARTING
2022/11/28 08:46:04 Job state: RUNNING
2022/11/28 09:40:16 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ac25f0168013d4ad3aeb38caa3f028a4)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669624969781_0001_01_000002(beam-loadtests-go-combine-flink-batch-726-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/28 09:40:16 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669624969781_0001_01_000002(beam-loadtests-go-combine-flink-batch-726-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/11/28 09:40:16 Job state: FAILED
2022/11/28 09:40:16 Failed to execute job: job load0tests0go0flink0batch0combine0101128065321-root-1128084602-27db3aef_f15f9e12-2cc7-4fc7-b899-f2f47f946dcd failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101128065321-root-1128084602-27db3aef_f15f9e12-2cc7-4fc7-b899-f2f47f946dcd failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc00004e0c0}, {0x14da42d?, 0x202f968?}, {0xc0005c7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 43s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/gxkjkxwa6wxgw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #725
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/725/display/redirect>
Changes:
------------------------------------------
[...truncated 34.60 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/27 08:45:44 Using specified **** binary: 'linux_amd64/combine'
2022/11/27 08:45:45 Prepared job with id: load-tests-go-flink-batch-combine-1-1127065326_a6b1e6b5-3c8b-4fd3-8fe3-d76baf5d12cf and staging token: load-tests-go-flink-batch-combine-1-1127065326_a6b1e6b5-3c8b-4fd3-8fe3-d76baf5d12cf
2022/11/27 08:45:54 Staged binary artifact with token:
2022/11/27 08:45:57 Submitted job: load0tests0go0flink0batch0combine0101127065326-root-1127084555-158950c1_7c6f75e3-04c9-4612-9d8b-7295a28ec887
2022/11/27 08:45:57 Job state: STOPPED
2022/11/27 08:45:57 Job state: STARTING
2022/11/27 08:45:57 Job state: RUNNING
2022/11/27 09:41:28 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: fa558879133c1d3e2412650d4887037f)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669538569243_0002_01_000003(beam-loadtests-go-combine-flink-batch-725-w-1.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/27 09:41:28 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669538569243_0002_01_000003(beam-loadtests-go-combine-flink-batch-725-w-1.c.apache-beam-testing.internal:8026) timed out.
2022/11/27 09:41:28 Job state: FAILED
2022/11/27 09:41:28 Failed to execute job: job load0tests0go0flink0batch0combine0101127065326-root-1127084555-158950c1_7c6f75e3-04c9-4612-9d8b-7295a28ec887 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101127065326-root-1127084555-158950c1_7c6f75e3-04c9-4612-9d8b-7295a28ec887 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc00004e0c0}, {0x14da42d?, 0x202f968?}, {0xc000025e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 56m 6s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hsaxecjwucxzw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #724
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/724/display/redirect?page=changes>
Changes:
[noreply] Update java-multi-language-pipelines.md (#24345)
------------------------------------------
[...truncated 34.66 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/26 08:45:49 Using specified worker binary: 'linux_amd64/combine'
2022/11/26 08:45:49 Prepared job with id: load-tests-go-flink-batch-combine-1-1126065321_b0d1e4f5-49a0-4e15-93e6-1a1b126ce48d and staging token: load-tests-go-flink-batch-combine-1-1126065321_b0d1e4f5-49a0-4e15-93e6-1a1b126ce48d
2022/11/26 08:45:55 Staged binary artifact with token:
2022/11/26 08:45:56 Submitted job: load0tests0go0flink0batch0combine0101126065321-root-1126084555-bfdc2538_768efa6c-7f75-4d92-a4e3-39f68143fc13
2022/11/26 08:45:56 Job state: STOPPED
2022/11/26 08:45:56 Job state: STARTING
2022/11/26 08:45:56 Job state: RUNNING
2022/11/26 09:40:56 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 0c88f48560e4b3fd02d3bba9055aa896)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669452159857_0001_01_000002(beam-loadtests-go-combine-flink-batch-724-w-0.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/26 09:40:56 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669452159857_0001_01_000002(beam-loadtests-go-combine-flink-batch-724-w-0.c.apache-beam-testing.internal:8026) timed out.
2022/11/26 09:40:56 Job state: FAILED
2022/11/26 09:40:56 Failed to execute job: job load0tests0go0flink0batch0combine0101126065321-root-1126084555-bfdc2538_768efa6c-7f75-4d92-a4e3-39f68143fc13 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101126065321-root-1126084555-bfdc2538_768efa6c-7f75-4d92-a4e3-39f68143fc13 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc00004e0c0}, {0x14da42d?, 0x202f968?}, {0xc0002a1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 29s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/emuz37mzvpljs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #723
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/723/display/redirect>
Changes:
------------------------------------------
[...truncated 34.59 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/25 08:45:32 Using specified worker binary: 'linux_amd64/combine'
2022/11/25 08:45:33 Prepared job with id: load-tests-go-flink-batch-combine-1-1125065326_6bb892a3-0bef-4110-8c50-22fa4fd89781 and staging token: load-tests-go-flink-batch-combine-1-1125065326_6bb892a3-0bef-4110-8c50-22fa4fd89781
2022/11/25 08:45:42 Staged binary artifact with token:
2022/11/25 08:45:45 Submitted job: load0tests0go0flink0batch0combine0101125065326-root-1125084543-d17c2739_a250df14-b9a5-4d03-be7e-5f8be1a3ff6f
2022/11/25 08:45:45 Job state: STOPPED
2022/11/25 08:45:45 Job state: STARTING
2022/11/25 08:45:45 Job state: RUNNING
2022/11/25 09:40:17 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 2d28e614441deafbfde12d01078774a8)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669365754362_0002_01_000003(beam-loadtests-go-combine-flink-batch-723-w-0.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/25 09:40:17 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669365754362_0002_01_000003(beam-loadtests-go-combine-flink-batch-723-w-0.c.apache-beam-testing.internal:8026) timed out.
2022/11/25 09:40:17 Job state: FAILED
2022/11/25 09:40:17 Failed to execute job: job load0tests0go0flink0batch0combine0101125065326-root-1125084543-d17c2739_a250df14-b9a5-4d03-be7e-5f8be1a3ff6f failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101125065326-root-1125084543-d17c2739_a250df14-b9a5-4d03-be7e-5f8be1a3ff6f failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc000128000}, {0x14da42d?, 0x202f968?}, {0xc00033fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 6s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/3jtm4zafaijae
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #722
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/722/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] update table text content overflow #23460
[Moritz Mack] [Spark dataset runner] Fix translation to run in the evaluation thread
[Moritz Mack] [Metrics] Add 'performance tests' tag to JMH dashboard (related to
[noreply] Bump github.com/aws/aws-sdk-go-v2/credentials in /sdks (#24318)
[noreply] Update apache beam installation in notebook (#24336)
[noreply] Adds GCP core dependency to the test expansion service (#24308)
[noreply] Update dataflow containers to coincide with objsize 0.6.1 update
[noreply] Add test configurations for deterministic outputs on Dataflow (#24325)
[noreply] Updates ExpansionService to support dynamically discovering and
[noreply] Enable streaming runner v2 tests that were forgotten to be enabled.
[noreply] A schema transform implementation for SpannerIO.Write (#24278)
------------------------------------------
[...truncated 34.64 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/24 08:45:57 Using specified worker binary: 'linux_amd64/combine'
2022/11/24 08:45:57 Prepared job with id: load-tests-go-flink-batch-combine-1-1124065327_54e9b26c-c68f-4a4d-9730-375d1e959400 and staging token: load-tests-go-flink-batch-combine-1-1124065327_54e9b26c-c68f-4a4d-9730-375d1e959400
2022/11/24 08:46:03 Staged binary artifact with token:
2022/11/24 08:46:05 Submitted job: load0tests0go0flink0batch0combine0101124065327-root-1124084603-7f8dd35e_5aa94d2e-bc0e-4b22-b0ea-aa54d70bb6e5
2022/11/24 08:46:05 Job state: STOPPED
2022/11/24 08:46:05 Job state: STARTING
2022/11/24 08:46:05 Job state: RUNNING
2022/11/24 09:40:25 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c8a98bf85ba9651cd0438406abeea1c4)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669279377632_0001_01_000003(beam-loadtests-go-combine-flink-batch-722-w-4.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/24 09:40:25 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669279377632_0001_01_000003(beam-loadtests-go-combine-flink-batch-722-w-4.c.apache-beam-testing.internal:8026) timed out.
2022/11/24 09:40:25 Job state: FAILED
2022/11/24 09:40:25 Failed to execute job: job load0tests0go0flink0batch0combine0101124065327-root-1124084603-7f8dd35e_5aa94d2e-bc0e-4b22-b0ea-aa54d70bb6e5 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101124065327-root-1124084603-7f8dd35e_5aa94d2e-bc0e-4b22-b0ea-aa54d70bb6e5 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc000128000}, {0x14da42d?, 0x202f968?}, {0xc000351e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 51s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4eoyjx75qhi7m
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #721
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/721/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] change share-your-story link, add text to ADD_CASE_STUDY.md,
[Robert Bradshaw] Add a portable runner that renders pipelines as a dot graph.
[Robert Bradshaw] Add basic tests for render runner.
[Robert Bradshaw] Add the ability to pass a pipeline proto directly.
[Robert Bradshaw] lint
[mr.malarg] pg_23865 fix selected example at list
[leha] Expand all categories that contain a selected example (#23865)
[Valentyn Tymofieiev] Fix typo.
[Valentyn Tymofieiev] Serve the graph when output file is not specified.
[Valentyn Tymofieiev] Serve the graph when output file is not specified.
[Valentyn Tymofieiev] Fix parsing of standalone protos.
[Valentyn Tymofieiev] Support reading from GCS.
[Valentyn Tymofieiev] Add text logging.
[Valentyn Tymofieiev] fix typo.
[Valentyn Tymofieiev] Some lint and yapf.
[Robert Bradshaw] Fix dot detection logic.
[Robert Bradshaw] fix error detected by lint
[Robert Bradshaw] Make gcs an optional dependency.
[Robert Bradshaw] return rather than sys.exit
[kn1kn1] Fix mvn command to refer the GCP_REGION variable
[Robert Bradshaw] lint
[noreply] Bump github.com/aws/aws-sdk-go-v2/feature/s3/manager in /sdks (#24280)
[noreply] Copy editing the machine learning pages (#24301)
[noreply] TensorRT Custom Inference Function Implementation (#24039)
[noreply] Teach Azure Filesystem to authenticate using DefaultAzureCredential in
[noreply] Apply suggestions from code review
[noreply] Add retry to test connections (#23757)
[Robert Bradshaw] More cleanup, mypy.
[noreply] fixed typo
[noreply] [#24266] Update release candidate script to use -PisRelease (#24269)
[noreply] Golang SpannerIO Implementation (#23285)
[noreply] Add rootCaCertificate option to SplunkIO (#24229)
[noreply] [Playground] Remove example bucket (#24198)
[noreply] Extract Go and Python Beam symbols for Playground (#23378)
[noreply] Dask runner tests action (#24324)
[Robert Bradshaw] lint
------------------------------------------
[...truncated 34.61 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/23 08:45:41 Using specified worker binary: 'linux_amd64/combine'
2022/11/23 08:45:42 Prepared job with id: load-tests-go-flink-batch-combine-1-1123065339_c06339a7-b2c9-4478-8b3c-d4794985ce55 and staging token: load-tests-go-flink-batch-combine-1-1123065339_c06339a7-b2c9-4478-8b3c-d4794985ce55
2022/11/23 08:45:48 Staged binary artifact with token:
2022/11/23 08:45:50 Submitted job: load0tests0go0flink0batch0combine0101123065339-root-1123084548-2f7b339f_cbe92dcc-e903-45a1-b344-ccd44225b941
2022/11/23 08:45:50 Job state: STOPPED
2022/11/23 08:45:50 Job state: STARTING
2022/11/23 08:45:50 Job state: RUNNING
2022/11/23 09:40:42 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 3491ede2bea633c070cf91bc7e1e7a2e)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669192960749_0001_01_000004(beam-loadtests-go-combine-flink-batch-721-w-4.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/23 09:40:42 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669192960749_0001_01_000004(beam-loadtests-go-combine-flink-batch-721-w-4.c.apache-beam-testing.internal:8026) timed out.
2022/11/23 09:40:43 Job state: FAILED
2022/11/23 09:40:43 Failed to execute job: job load0tests0go0flink0batch0combine0101123065339-root-1123084548-2f7b339f_cbe92dcc-e903-45a1-b344-ccd44225b941 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101123065339-root-1123084548-2f7b339f_cbe92dcc-e903-45a1-b344-ccd44225b941 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc000128000}, {0x14da42d?, 0x202f968?}, {0xc000297e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 23s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/7lryl7y2txxpe
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #720
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/720/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Work around coders bug on Dataflow.
[Robert Bradshaw] Add a test runner for running multiple pipelines in parallel.
[Robert Bradshaw] Run tests as single Dataflow pipeline.
[Robert Bradshaw] Github hook for dataflow pipelines.
[Robert Bradshaw] Guard dataflow run against GCP credentials.
[Robert Bradshaw] Guard running of precommit against having variables set.
[bulat.safiullin] [Website] add lazy loading attr to images #24250
[noreply] [Playground] Use current Go SDK by default (#24256)
[noreply] Fix dashboard links
[noreply] Bump github.com/aws/aws-sdk-go-v2/config from 1.18.1 to 1.18.2 in /sdks
[noreply] Add warning about google-cloud-platform-core dependency change in #24235
[noreply] Add GetSize implementation for DetectNewPartitions SDF (#23997)
[noreply] Add ZstdCoder to wrap coders with Zstandard compression (#24093)
[noreply] [#24261] Update to objsize 0.6.1 (#24262)
[noreply] Create template for failing tests. (#21728)
[noreply] Add record_metrics argument to utils.BatchElements (#23701)
[noreply] Performance test parameters followup fix (#24291)
------------------------------------------
[...truncated 34.59 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/22 13:57:08 Using specified **** binary: 'linux_amd64/combine'
2022/11/22 13:57:09 Prepared job with id: load-tests-go-flink-batch-combine-1-1122133803_06077a53-883e-433e-b9f1-2c2b25bba37f and staging token: load-tests-go-flink-batch-combine-1-1122133803_06077a53-883e-433e-b9f1-2c2b25bba37f
2022/11/22 13:57:14 Staged binary artifact with token:
2022/11/22 13:57:16 Submitted job: load0tests0go0flink0batch0combine0101122133803-root-1122135715-8758bb3a_307d6801-6288-4a0b-87c7-ab04914c8b7e
2022/11/22 13:57:16 Job state: STOPPED
2022/11/22 13:57:16 Job state: STARTING
2022/11/22 13:57:16 Job state: RUNNING
2022/11/22 14:50:45 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 4d5eed6ff669c73dd93e07bc69ff9b03)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669125238915_0001_01_000003(beam-loadtests-go-combine-flink-batch-720-w-1.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/22 14:50:45 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669125238915_0001_01_000003(beam-loadtests-go-combine-flink-batch-720-w-1.c.apache-beam-testing.internal:8026) timed out.
2022/11/22 14:50:46 Job state: FAILED
2022/11/22 14:50:46 Failed to execute job: job load0tests0go0flink0batch0combine0101122133803-root-1122135715-8758bb3a_307d6801-6288-4a0b-87c7-ab04914c8b7e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101122133803-root-1122135715-8758bb3a_307d6801-6288-4a0b-87c7-ab04914c8b7e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc000130000}, {0x14da42d?, 0x202f968?}, {0xc000353e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 11s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wgaz7jmmfzc2k
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #719
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/719/display/redirect>
Changes:
------------------------------------------
[...truncated 34.57 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/21 08:45:15 Using specified **** binary: 'linux_amd64/combine'
2022/11/21 08:45:16 Prepared job with id: load-tests-go-flink-batch-combine-1-1121065327_20427cf3-c412-45a6-a118-8e74ae5983a7 and staging token: load-tests-go-flink-batch-combine-1-1121065327_20427cf3-c412-45a6-a118-8e74ae5983a7
2022/11/21 08:45:24 Staged binary artifact with token:
2022/11/21 08:45:27 Submitted job: load0tests0go0flink0batch0combine0101121065327-root-1121084525-991e87e3_796dd50e-ba77-4588-aa13-6d65857c91f3
2022/11/21 08:45:28 Job state: STOPPED
2022/11/21 08:45:28 Job state: STARTING
2022/11/21 08:45:28 Job state: RUNNING
2022/11/21 09:40:02 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 08596c60c2dac01f6c985a923cd8a863)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669020145425_0001_01_000005(beam-loadtests-go-combine-flink-batch-719-w-1.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/21 09:40:02 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1669020145425_0001_01_000005(beam-loadtests-go-combine-flink-batch-719-w-1.c.apache-beam-testing.internal:8026) timed out.
2022/11/21 09:40:03 Job state: FAILED
2022/11/21 09:40:03 Failed to execute job: job load0tests0go0flink0batch0combine0101121065327-root-1121084525-991e87e3_796dd50e-ba77-4588-aa13-6d65857c91f3 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101121065327-root-1121084525-991e87e3_796dd50e-ba77-4588-aa13-6d65857c91f3 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc000128000}, {0x14da42d?, 0x202f968?}, {0xc0004efe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 12s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/lmrsei7xv72aa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #718
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/718/display/redirect?page=changes>
Changes:
[samuelw] Fix OrderedListState for Dataflow Streaming pipelines on SE.
------------------------------------------
[...truncated 34.69 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/20 08:46:00 Using specified **** binary: 'linux_amd64/combine'
2022/11/20 08:46:01 Prepared job with id: load-tests-go-flink-batch-combine-1-1120065326_b9f6a499-3368-480f-a8c8-489ec46ea104 and staging token: load-tests-go-flink-batch-combine-1-1120065326_b9f6a499-3368-480f-a8c8-489ec46ea104
2022/11/20 08:46:08 Staged binary artifact with token:
2022/11/20 08:46:10 Submitted job: load0tests0go0flink0batch0combine0101120065326-root-1120084609-e2389069_2953b90f-d91a-465f-9f96-08a8e011ad9e
2022/11/20 08:46:10 Job state: STOPPED
2022/11/20 08:46:10 Job state: STARTING
2022/11/20 08:46:10 Job state: RUNNING
2022/11/20 09:40:45 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 423c33182c97f0e562162cf6c6abba1f)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668933766171_0001_01_000003(beam-loadtests-go-combine-flink-batch-718-w-1.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/20 09:40:45 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668933766171_0001_01_000003(beam-loadtests-go-combine-flink-batch-718-w-1.c.apache-beam-testing.internal:8026) timed out.
2022/11/20 09:40:45 Job state: FAILED
2022/11/20 09:40:45 Failed to execute job: job load0tests0go0flink0batch0combine0101120065326-root-1120084609-e2389069_2953b90f-d91a-465f-9f96-08a8e011ad9e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101120065326-root-1120084609-e2389069_2953b90f-d91a-465f-9f96-08a8e011ad9e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc000128000}, {0x14da42d?, 0x202f968?}, {0xc000409e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 7s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/3q6sg37dxuubu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #717
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/717/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Remove overly broad CanIgnoreReturnValue
[Kenneth Knowles] Add @RunWith annotation to pubsublite test
[noreply] Move dashboard links to dedicated section
[noreply] [Spark dataset runner] Cache datasets if used multiple times (#24009)
[noreply] Remove section from troubleshooting about fixed dictionary issue
[noreply] Fix flink XVR tests (#24228)
[noreply] Adding the list of example notebooks to the ML readme file. (#24255)
[noreply] Updating timezone for Beam 2.43.0 release (#24258)
[Kenneth Knowles] Only skip checkerframework if explicitly requested
[noreply] Issue#21430 Updated dataframe io to avoid pruning
[noreply] SingleStoreIO (#23535)
[chamikaramj] Update Java Multi-lang quickstart after the Beam 2.43.0 release
------------------------------------------
[...truncated 34.62 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/19 08:45:33 Using specified **** binary: 'linux_amd64/combine'
2022/11/19 08:45:34 Prepared job with id: load-tests-go-flink-batch-combine-1-1119065338_817728a9-2930-43af-b572-724f5ea90e51 and staging token: load-tests-go-flink-batch-combine-1-1119065338_817728a9-2930-43af-b572-724f5ea90e51
2022/11/19 08:45:43 Staged binary artifact with token:
2022/11/19 08:45:44 Submitted job: load0tests0go0flink0batch0combine0101119065338-root-1119084543-54b546ef_cd486353-772d-4aa7-b1c8-3defd1684990
2022/11/19 08:45:44 Job state: STOPPED
2022/11/19 08:45:44 Job state: STARTING
2022/11/19 08:45:44 Job state: RUNNING
2022/11/19 09:40:28 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 509a8f461cc37e25c1846ca5e93b39f3)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668847357477_0001_01_000003(beam-loadtests-go-combine-flink-batch-717-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/19 09:40:28 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668847357477_0001_01_000003(beam-loadtests-go-combine-flink-batch-717-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/11/19 09:40:28 Job state: FAILED
2022/11/19 09:40:28 Failed to execute job: job load0tests0go0flink0batch0combine0101119065338-root-1119084543-54b546ef_cd486353-772d-4aa7-b1c8-3defd1684990 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101119065338-root-1119084543-54b546ef_cd486353-772d-4aa7-b1c8-3defd1684990 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc00004e0d0}, {0x14da42d?, 0x202f968?}, {0xc00035de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 55m 18s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/crpug4kxkibbe
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #716
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/716/display/redirect?page=changes>
Changes:
[Moritz Mack] [Metrics] Add new performance dashboard for Java JMH benchmarks (closes
[Kenneth Knowles] Fix checkArgument format string in ByteKeyRange
[egalpin] Uses _all to follow alias/datastreams when estimating index size
[Yi Hu] Unify test parameters for certain IOs based on test row and grafana
[egalpin] Adds test for following aliases when estimating index size
[noreply] Bump github.com/aws/aws-sdk-go-v2/config from 1.18.0 to 1.18.1 in /sdks
[noreply] Add enableGzipHttpCompression option to SplunkIO (#24197)
[noreply] [Playground] Examples CI restore (#24155)
[noreply] Bump github.com/aws/aws-sdk-go-v2/service/s3 in /sdks (#24220)
[noreply] Issue#24161 Updated docstring for Clusters class
[Kenneth Knowles] Fix checkArgument format string in BigQueryQueryHelper
[Kenneth Knowles] Fix checkArgument calls in BQ dynamic destinations
[noreply] Force discarding mode in with_fanout without rewindowing. (#23828)
[noreply] Removed trailing whitespaces.
[noreply] Clarify that SDF authors need to make the restriction sizing method
[noreply] Remove google-cloud-platform-core dependency from harness (#24235)
[noreply] Document our benchmarks (#24216)
[noreply] Website updates for Beam 2.43.0 release (#24044)
[Kenneth Knowles] Add @RunWith annotation to BQ test class
[chamikaramj] Fix release date
[chamikaramj] Few more fixes to the Website
------------------------------------------
[...truncated 34.58 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/18 08:45:41 Using specified **** binary: 'linux_amd64/combine'
2022/11/18 08:45:42 Prepared job with id: load-tests-go-flink-batch-combine-1-1118065337_dec62482-86c0-4943-aa87-5cdbc295bec0 and staging token: load-tests-go-flink-batch-combine-1-1118065337_dec62482-86c0-4943-aa87-5cdbc295bec0
2022/11/18 08:45:49 Staged binary artifact with token:
2022/11/18 08:45:51 Submitted job: load0tests0go0flink0batch0combine0101118065337-root-1118084550-6ff2f734_f75d4302-f533-4824-97df-fb4fd087a4d5
2022/11/18 08:45:51 Job state: STOPPED
2022/11/18 08:45:51 Job state: STARTING
2022/11/18 08:45:51 Job state: RUNNING
2022/11/18 09:39:44 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e46b04c620eb13f37008228de8f0f9c4)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668760960159_0001_01_000004(beam-loadtests-go-combine-flink-batch-716-w-1.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/18 09:39:44 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668760960159_0001_01_000004(beam-loadtests-go-combine-flink-batch-716-w-1.c.apache-beam-testing.internal:8026) timed out.
2022/11/18 09:39:44 Job state: FAILED
2022/11/18 09:39:44 Failed to execute job: job load0tests0go0flink0batch0combine0101118065337-root-1118084550-6ff2f734_f75d4302-f533-4824-97df-fb4fd087a4d5 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101118065337-root-1118084550-6ff2f734_f75d4302-f533-4824-97df-fb4fd087a4d5 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc000128000}, {0x14da42d?, 0x202f968?}, {0xc0005dbe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 34s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/z6thpvnykpaoq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #715
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/715/display/redirect?page=changes>
Changes:
[chamikaramj] Add a reference to Java RunInference example
[Jozef Vilcek] Re-use serializable pipeline options when already available (#24192)
[lakshmanansathya] refs: issue-24196, fix broken hyperlink
[noreply] Fix Python PostCommit Example CustomPTransformIT on portable (#24159)
[noreply] revert upgrade to go 1.19 for action unit tests (#24189)
[noreply] Use only ValueProviders in SpannerConfig (#24156)
[noreply] [Tour of Beam] [Frontend] Content tree URLs (#23776)
[noreply] Python TextIO Performance Test (#23951)
[Chamikara Madhusanka Jayalath] Temporary update Python RC validation job
[Chamikara Madhusanka Jayalath] updates
[Chamikara Madhusanka Jayalath] updates
[noreply] Fix PythonLint (#24219)
[noreply] Bump loader-utils from 1.4.1 to 1.4.2 in
------------------------------------------
[...truncated 34.67 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/17 08:45:51 Using specified **** binary: 'linux_amd64/combine'
2022/11/17 08:45:51 Prepared job with id: load-tests-go-flink-batch-combine-1-1117065328_6222a0cd-44c1-4509-806f-85aee036ba10 and staging token: load-tests-go-flink-batch-combine-1-1117065328_6222a0cd-44c1-4509-806f-85aee036ba10
2022/11/17 08:45:58 Staged binary artifact with token:
2022/11/17 08:46:00 Submitted job: load0tests0go0flink0batch0combine0101117065328-root-1117084558-2ed6fb1d_6786d6c4-1d1d-4850-b56f-087948b4f79f
2022/11/17 08:46:00 Job state: STOPPED
2022/11/17 08:46:00 Job state: STARTING
2022/11/17 08:46:00 Job state: RUNNING
2022/11/17 09:40:12 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: f98b901c2bbc0368fd4a887074e1cded)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668674581099_0001_01_000002(beam-loadtests-go-combine-flink-batch-715-w-3.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/17 09:40:12 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668674581099_0001_01_000002(beam-loadtests-go-combine-flink-batch-715-w-3.c.apache-beam-testing.internal:8026) timed out.
2022/11/17 09:40:12 Job state: FAILED
2022/11/17 09:40:12 Failed to execute job: job load0tests0go0flink0batch0combine0101117065328-root-1117084558-2ed6fb1d_6786d6c4-1d1d-4850-b56f-087948b4f79f failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101117065328-root-1117084558-2ed6fb1d_6786d6c4-1d1d-4850-b56f-087948b4f79f failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc00004e0d0}, {0x14da42d?, 0x202f968?}, {0xc000351e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 42s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/76qkhhtsiwl52
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #714
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/714/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Fix arguments to checkState in BatchViewOverrides
[bulat.safiullin] [Website] update pre tag copy link styles #23064
[noreply] [Dockerized Jenkins] Update README how to use local repo (#24055)
[noreply] [Dockerized Jenkins] Fix build of dockerized jenkins (fixes #24053)
[noreply] Bump github.com/aws/aws-sdk-go-v2/feature/s3/manager in /sdks (#24131)
[noreply] Editorial review of the ML notebooks. (#24125)
[noreply] Configure flutter_code_editor options with Hugo shortcode (#23926)
[noreply] Eliminate CalciteUtil.CharType logical type (#24013)
[noreply] [Playground] Move Playground in GKE and Infrastructure change (#23928)
[noreply] Fix broken notebook (#24179)
[noreply] Add error reporting for BatchConverter match failure (#24022)
[noreply] Update automation to use Go 1.19 (#24175)
[noreply] Fix broken json for notebook (#24183)
[noreply] Using Teardown context instead of deprecated finalize (#24180)
[noreply] [Python]Support pipe operator as Union (PEP -604) (#24106)
[noreply] Updated README of Interactive Beam
[noreply] Minor update
[noreply] Add custom inference function support to the PyTorch model handler
[noreply] Strip FGAC database role from changestreams metadata requests (#24177)
------------------------------------------
[...truncated 34.59 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/16 08:45:52 Using specified **** binary: 'linux_amd64/combine'
2022/11/16 08:45:53 Prepared job with id: load-tests-go-flink-batch-combine-1-1116065320_d8bae057-3b29-4567-96f1-e785630aea98 and staging token: load-tests-go-flink-batch-combine-1-1116065320_d8bae057-3b29-4567-96f1-e785630aea98
2022/11/16 08:45:58 Staged binary artifact with token:
2022/11/16 08:45:59 Submitted job: load0tests0go0flink0batch0combine0101116065320-root-1116084558-cf57bcae_07837abd-be9c-4a83-9ba6-defc32d2599c
2022/11/16 08:45:59 Job state: STOPPED
2022/11/16 08:45:59 Job state: STARTING
2022/11/16 08:45:59 Job state: RUNNING
2022/11/16 09:39:21 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ef24db8c17e82f6b06407751b5430346)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668588164821_0002_01_000003(beam-loadtests-go-combine-flink-batch-714-w-2.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/16 09:39:21 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668588164821_0002_01_000003(beam-loadtests-go-combine-flink-batch-714-w-2.c.apache-beam-testing.internal:8026) timed out.
2022/11/16 09:39:21 Job state: FAILED
2022/11/16 09:39:21 Failed to execute job: job load0tests0go0flink0batch0combine0101116065320-root-1116084558-cf57bcae_07837abd-be9c-4a83-9ba6-defc32d2599c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101116065320-root-1116084558-cf57bcae_07837abd-be9c-4a83-9ba6-defc32d2599c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1678408, 0xc00004e0d0}, {0x14da42d?, 0x202f968?}, {0xc00066be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3bf
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 54m 4s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/piay72bbq4q6e
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #713
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/713/display/redirect?page=changes>
Changes:
[noreply] Bump loader-utils
[chamikaramj] Updates Multi-lang Java quickstart
[Kenneth Knowles] Fix checkArgument format in GcsPath
[noreply] [Tour Of Beam] verify that unit exists when saving progress (#24118)
[noreply] Cleanup stale BQ datasets (#24158)
[noreply] Support SqlTypes Date and Timestamp (MicrosInstant) in AvroUtils
[noreply] Add more tests for S3 filesystem (#24138)
[noreply] Merge pull request #23333: Track time on Cloud Dataflow streaming data
[Robert Bradshaw] Rename the test_splits flag to direct_test_splits.
[noreply] Adding a quickstart to README for the TS SDK (#23509)
[noreply] More dataset templates to clean up (#24162)
[noreply] Implement embedded WebAssembly example (#24081)
------------------------------------------
[...truncated 34.64 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/15 08:45:46 Using specified **** binary: 'linux_amd64/combine'
2022/11/15 08:45:47 Prepared job with id: load-tests-go-flink-batch-combine-1-1115065340_7f6c0955-c3db-4c40-bd91-381d03754b98 and staging token: load-tests-go-flink-batch-combine-1-1115065340_7f6c0955-c3db-4c40-bd91-381d03754b98
2022/11/15 08:45:53 Staged binary artifact with token:
2022/11/15 08:45:54 Submitted job: load0tests0go0flink0batch0combine0101115065340-root-1115084553-d31af3dc_f08b477b-0e18-45ce-b4d6-ffa2ecdcd127
2022/11/15 08:45:55 Job state: STOPPED
2022/11/15 08:45:55 Job state: STARTING
2022/11/15 08:45:55 Job state: RUNNING
2022/11/15 09:39:18 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 40d095b3cfacc1af74d166fd14ef646e)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.UpdateSchedulerNgOnInternalFailuresListener.notifyTaskFailure(UpdateSchedulerNgOnInternalFailuresListener.java:51)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraph.notifySchedulerNgAboutInternalTaskFailure(DefaultExecutionGraph.java:1536)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1118)
at org.apache.flink.runtime.executiongraph.Execution.processFail(Execution.java:1058)
at org.apache.flink.runtime.executiongraph.Execution.fail(Execution.java:759)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.signalPayloadRelease(SingleLogicalSlot.java:195)
at org.apache.flink.runtime.jobmaster.slotpool.SingleLogicalSlot.release(SingleLogicalSlot.java:182)
at org.apache.flink.runtime.scheduler.SharedSlot.lambda$release$4(SharedSlot.java:271)
at java.util.concurrent.CompletableFuture.uniAcceptNow(CompletableFuture.java:753)
at java.util.concurrent.CompletableFuture.uniAcceptStage(CompletableFuture.java:731)
at java.util.concurrent.CompletableFuture.thenAccept(CompletableFuture.java:2108)
at org.apache.flink.runtime.scheduler.SharedSlot.release(SharedSlot.java:271)
at org.apache.flink.runtime.jobmaster.slotpool.AllocatedSlot.releasePayload(AllocatedSlot.java:152)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releasePayload(DefaultDeclarativeSlotPool.java:482)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.freeAndReleaseSlots(DefaultDeclarativeSlotPool.java:474)
at org.apache.flink.runtime.jobmaster.slotpool.DefaultDeclarativeSlotPool.releaseSlots(DefaultDeclarativeSlotPool.java:445)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.internalReleaseTaskManager(DeclarativeSlotPoolService.java:249)
at org.apache.flink.runtime.jobmaster.slotpool.DeclarativeSlotPoolService.releaseTaskManager(DeclarativeSlotPoolService.java:230)
at org.apache.flink.runtime.jobmaster.JobMaster.disconnectTaskManager(JobMaster.java:505)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.handleTaskManagerConnectionLoss(JobMaster.java:1376)
at org.apache.flink.runtime.jobmaster.JobMaster$TaskManagerHeartbeatListener.notifyHeartbeatTimeout(JobMaster.java:1371)
at org.apache.flink.runtime.heartbeat.HeartbeatMonitorImpl.run(HeartbeatMonitorImpl.java:155)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
at java.util.concurrent.FutureTask.run(FutureTask.java:264)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRunAsync$4(AkkaRpcActor.java:443)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:68)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRunAsync(AkkaRpcActor.java:443)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:213)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668501770355_0001_01_000004(beam-loadtests-go-combine-flink-batch-713-w-0.c.apache-beam-testing.internal:8026) timed out.
... 31 more
2022/11/15 09:39:18 (): java.util.concurrent.TimeoutException: Heartbeat of TaskManager with id container_1668501770355_0001_01_000004(beam-loadtests-go-combine-flink-batch-713-w-0.c.apache-beam-testing.internal:8026) timed out.
2022/11/15 09:39:18 Job state: FAILED
2022/11/15 09:39:18 Failed to execute job: job load0tests0go0flink0batch0combine0101115065340-root-1115084553-d31af3dc_f08b477b-0e18-45ce-b4d6-ffa2ecdcd127 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101115065340-root-1115084553-d31af3dc_f08b477b-0e18-45ce-b4d6-ffa2ecdcd127 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1634f28, 0xc00012e000}, {0x14964a9?, 0x1fb46e0?}, {0xc00062de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 53m 51s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/i4esmmxoojxca
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #712
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/712/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] change headers size from h4,h3 to h2 #24082
[bulat.safiullin] [Website] update go-dependencies.md java-dependencies.md
[Kenneth Knowles] Fix checkArgument format string in AvroIO
[Kenneth Knowles] Remove extraneous jetbrains annotation
[noreply] Bump golang.org/x/net from 0.1.0 to 0.2.0 in /sdks (#24153)
[noreply] Make MonotonicWatermarkEstimator work like its Java SDK equivalent
[noreply] Test Dataproc 2.1 with Flink load tests (#24129)
[noreply] Change DataflowBatchWorkerHarness doWork error level to INFO (#24135)
[noreply] Bump github.com/aws/aws-sdk-go-v2/config from 1.17.10 to 1.18.0 in /sdks
------------------------------------------
[...truncated 33.92 KB...]
component_coder_ids: "c9"
>
>
coders: <
key: "c11"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c0"
>
>
coders: <
key: "c12"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c11"
>
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/14 17:53:24 Using specified worker binary: 'linux_amd64/combine'
2022/11/14 17:53:25 Prepared job with id: load-tests-go-flink-batch-combine-1-1114150136_dfe6793d-9532-43ef-a137-f9c92aebb642 and staging token: load-tests-go-flink-batch-combine-1-1114150136_dfe6793d-9532-43ef-a137-f9c92aebb642
2022/11/14 17:53:31 Staged binary artifact with token:
2022/11/14 17:53:33 Submitted job: load0tests0go0flink0batch0combine0101114150136-root-1114175331-5645fcff_0e7028c0-f867-4594-b9fd-7fcb2c6bfc49
2022/11/14 17:53:33 Job state: STOPPED
2022/11/14 17:53:33 Job state: STARTING
2022/11/14 17:53:33 Job state: RUNNING
2022/11/14 17:53:47 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.flink.runtime.client.JobInitializationException: Could not start the JobMaster.
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: org.apache.flink.runtime.client.JobInitializationException: Could not start the JobMaster.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: org.apache.flink.runtime.client.JobInitializationException: Could not start the JobMaster.
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:75)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
... 1 more
Caused by: org.apache.flink.runtime.client.JobInitializationException: Could not start the JobMaster.
at org.apache.flink.runtime.jobmaster.DefaultJobMasterServiceProcess.lambda$new$0(DefaultJobMasterServiceProcess.java:97)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:859)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:837)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:506)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1705)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.lang.Thread.run(Thread.java:829)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.api.common.InvalidProgramException: The job graph is cyclic.
at java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:314)
at java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:319)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1702)
... 3 more
Caused by: org.apache.flink.api.common.InvalidProgramException: The job graph is cyclic.
at org.apache.flink.runtime.jobgraph.JobGraph.getVerticesSortedTopologicallyFromSources(JobGraph.java:442)
at org.apache.flink.runtime.executiongraph.DefaultExecutionGraphBuilder.buildGraph(DefaultExecutionGraphBuilder.java:186)
at org.apache.flink.runtime.scheduler.DefaultExecutionGraphFactory.createAndRestoreExecutionGraph(DefaultExecutionGraphFactory.java:149)
at org.apache.flink.runtime.scheduler.SchedulerBase.createAndRestoreExecutionGraph(SchedulerBase.java:363)
at org.apache.flink.runtime.scheduler.SchedulerBase.<init>(SchedulerBase.java:208)
at org.apache.flink.runtime.scheduler.DefaultScheduler.<init>(DefaultScheduler.java:191)
at org.apache.flink.runtime.scheduler.DefaultScheduler.<init>(DefaultScheduler.java:139)
at org.apache.flink.runtime.scheduler.DefaultSchedulerFactory.createInstance(DefaultSchedulerFactory.java:135)
at org.apache.flink.runtime.jobmaster.DefaultSlotPoolServiceSchedulerFactory.createScheduler(DefaultSlotPoolServiceSchedulerFactory.java:115)
at org.apache.flink.runtime.jobmaster.JobMaster.createScheduler(JobMaster.java:345)
at org.apache.flink.runtime.jobmaster.JobMaster.<init>(JobMaster.java:322)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.internalCreateJobMasterService(DefaultJobMasterServiceFactory.java:106)
at org.apache.flink.runtime.jobmaster.factories.DefaultJobMasterServiceFactory.lambda$createJobMasterService$0(DefaultJobMasterServiceFactory.java:94)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedSupplier$4(FunctionUtils.java:112)
at java.util.concurrent.CompletableFuture$AsyncSupply.run(CompletableFuture.java:1700)
... 3 more
2022/11/14 17:53:47 (): org.apache.flink.api.common.InvalidProgramException: The job graph is cyclic.
2022/11/14 17:53:47 Job state: FAILED
2022/11/14 17:53:47 Failed to execute job: job load0tests0go0flink0batch0combine0101114150136-root-1114175331-5645fcff_0e7028c0-f867-4594-b9fd-7fcb2c6bfc49 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101114150136-root-1114175331-5645fcff_0e7028c0-f867-4594-b9fd-7fcb2c6bfc49 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1634f28, 0xc00004a0d0}, {0x14964a9?, 0x1fb46e0?}, {0xc000323e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 57s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/k64s6bkdh2ovm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
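The root failure in the build above is `org.apache.flink.api.common.InvalidProgramException: The job graph is cyclic`, thrown while `JobGraph.getVerticesSortedTopologicallyFromSources` tries to order the job vertices. A topological sort fails exactly when the graph contains a cycle; the sketch below illustrates that check with Kahn's algorithm on a hypothetical string-keyed graph (an illustration of the technique only, not Flink's implementation):

```go
package main

import "fmt"

// topoSort runs Kahn's algorithm over a directed graph given as
// adjacency lists. It returns an error when a cycle prevents a full
// topological order, which is the condition Flink reports as
// "The job graph is cyclic".
func topoSort(adj map[string][]string) ([]string, error) {
	indeg := map[string]int{}
	for v, succs := range adj {
		if _, ok := indeg[v]; !ok {
			indeg[v] = 0
		}
		for _, w := range succs {
			indeg[w]++
		}
	}
	var queue, order []string
	for v, d := range indeg {
		if d == 0 {
			queue = append(queue, v)
		}
	}
	for len(queue) > 0 {
		v := queue[0]
		queue = queue[1:]
		order = append(order, v)
		for _, w := range adj[v] {
			if indeg[w]--; indeg[w] == 0 {
				queue = append(queue, w)
			}
		}
	}
	if len(order) != len(indeg) {
		// Some vertices never reached in-degree zero: they sit on a cycle.
		return nil, fmt.Errorf("the job graph is cyclic")
	}
	return order, nil
}

func main() {
	// A two-vertex cycle fails, like the translated pipeline did.
	_, err := topoSort(map[string][]string{"a": {"b"}, "b": {"a"}})
	fmt.Println(err)

	// An acyclic graph sorts fine.
	order, _ := topoSort(map[string][]string{"s1": {"e4"}, "e4": {}})
	fmt.Println(order)
}
```

When every vertex with in-degree zero has been drained and unordered vertices remain, the leftovers all lie on a cycle; that is the point at which the JobMaster aborts during initialization.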
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #711
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/711/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Fix checkArgument format string in ExecutionStateTracker
------------------------------------------
[...truncated 34.08 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/14 08:44:10 Using specified worker binary: 'linux_amd64/combine'
2022/11/14 08:44:11 Prepared job with id: load-tests-go-flink-batch-combine-1-1114065323_ac7cb896-0a86-4abf-a505-9c1c0cf9dee5 and staging token: load-tests-go-flink-batch-combine-1-1114065323_ac7cb896-0a86-4abf-a505-9c1c0cf9dee5
2022/11/14 08:44:15 Staged binary artifact with token:
2022/11/14 08:44:16 Submitted job: load0tests0go0flink0batch0combine0101114065323-root-1114084415-a5c8068a_5d332212-1fd9-4a14-9763-3cd593802849
2022/11/14 08:44:16 Job state: STOPPED
2022/11/14 08:44:16 Job state: STARTING
2022/11/14 08:44:16 Job state: RUNNING
2022/11/14 08:45:25 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/14 08:45:25 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/14 08:45:25 Job state: FAILED
2022/11/14 08:45:25 Failed to execute job: job load0tests0go0flink0batch0combine0101114065323-root-1114084415-a5c8068a_5d332212-1fd9-4a14-9763-3cd593802849 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101114065323-root-1114084415-a5c8068a_5d332212-1fd9-4a14-9763-3cd593802849 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1634d28, 0xc00004a0d0}, {0x149646d?, 0x1fb3680?}, {0xc000165e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/rwm4kzfm5nvoq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
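The failure mode in the build above is different: the Flink REST client could not deserialize the `JobDetailsInfo` response because `maxParallelism` came back as JSON `null`, and the strict Jackson mapping refuses to map `null` into the primitive `long` (often a symptom of a client/cluster version skew, where the running JobManager does not populate a field the client's message class expects). A minimal Go sketch of the same strict-null check, using a hypothetical subset of the payload:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// jobDetails models a small, hypothetical subset of Flink's
// JobDetailsInfo REST payload. Using *int64 rather than int64 lets
// us tell a JSON null (or absent field) apart from a genuine 0,
// which is exactly the distinction the Jackson mapping tripped on.
type jobDetails struct {
	JID            string `json:"jid"`
	MaxParallelism *int64 `json:"maxParallelism"`
}

func parseJobDetails(data []byte) (jobDetails, error) {
	var d jobDetails
	if err := json.Unmarshal(data, &d); err != nil {
		return d, err
	}
	if d.MaxParallelism == nil {
		// Reject null instead of silently coercing it to zero,
		// mirroring the FAIL_ON_NULL_FOR_PRIMITIVES behavior in
		// the stack trace above.
		return d, fmt.Errorf("cannot map null into maxParallelism")
	}
	return d, nil
}

func main() {
	// A response that carries null for the field is rejected.
	_, err := parseJobDetails([]byte(`{"jid":"abc","maxParallelism":null}`))
	fmt.Println(err)

	// A well-formed response parses cleanly.
	d, _ := parseJobDetails([]byte(`{"jid":"abc","maxParallelism":128}`))
	fmt.Println(*d.MaxParallelism)
}
```

The pointer field is the design choice here: it preserves the null/zero distinction that a primitive erases, so validation can fail loudly at the parse boundary rather than propagating a bogus default.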
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #710
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/710/display/redirect>
Changes:
------------------------------------------
[...truncated 33.90 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/13 08:43:36 Using specified worker binary: 'linux_amd64/combine'
2022/11/13 08:43:37 Prepared job with id: load-tests-go-flink-batch-combine-1-1113065319_128b19c9-5c17-4d4c-821d-48704ed6938d and staging token: load-tests-go-flink-batch-combine-1-1113065319_128b19c9-5c17-4d4c-821d-48704ed6938d
2022/11/13 08:43:41 Staged binary artifact with token:
2022/11/13 08:43:42 Submitted job: load0tests0go0flink0batch0combine0101113065319-root-1113084341-87289856_408ab259-a9e5-40d8-8694-6c969bc73f9d
2022/11/13 08:43:42 Job state: STOPPED
2022/11/13 08:43:42 Job state: STARTING
2022/11/13 08:43:42 Job state: RUNNING
2022/11/13 08:44:51 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/13 08:44:51 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
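The root cause in the trace above is Flink's REST client mapping the JSON field `maxParallelism: null` from the `JobDetailsInfo` response into a primitive Java `long` while Jackson's `FAIL_ON_NULL_FOR_PRIMITIVES` feature is enabled. The failure mode can be illustrated with a small Python sketch; the class, function, and field handling below are illustrative stand-ins, not Flink's actual code:

```python
import json

class JobDetailsInfo:
    """Illustrative stand-in for Flink's JobDetailsInfo REST message."""
    def __init__(self, name, max_parallelism):
        self.name = name
        self.max_parallelism = max_parallelism

def parse_job_details(payload, fail_on_null_for_primitives=True):
    """Deserialize a REST response, mimicking Jackson's primitive handling.

    With the strict flag on, a null value for a primitive-typed field is an
    error; with it off, the field falls back to the primitive default (0),
    which is what the exception message's hint about setting
    FAIL_ON_NULL_FOR_PRIMITIVES to 'false' refers to.
    """
    data = json.loads(payload)
    max_parallelism = data.get("maxParallelism")
    if max_parallelism is None:
        if fail_on_null_for_primitives:
            raise ValueError(
                'Cannot map `null` into type `long` '
                '(through reference chain: JobDetailsInfo["maxParallelism"])')
        max_parallelism = 0  # primitive default when nulls are allowed
    return JobDetailsInfo(data["name"], max_parallelism)

# A JobManager that has not finished initializing the job may report null here:
response = '{"name": "combine-load-test", "maxParallelism": null}'
try:
    parse_job_details(response)
except ValueError as e:
    print("strict:", e)

lenient = parse_job_details(response, fail_on_null_for_primitives=False)
print("lenient max_parallelism:", lenient.max_parallelism)
```

In the log above this parse error surfaces as `RestClientException: Response was neither of the expected type ... nor an error`, because the client could not materialize the expected message type from the response body.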
2022/11/13 08:44:51 Job state: FAILED
2022/11/13 08:44:51 Failed to execute job: job load0tests0go0flink0batch0combine0101113065319-root-1113084341-87289856_408ab259-a9e5-40d8-8694-6c969bc73f9d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101113065319-root-1113084341-87289856_408ab259-a9e5-40d8-8694-6c969bc73f9d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1634d28, 0xc00012e000}, {0x149646d?, 0x1fb3680?}, {0xc0000dbe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 32s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4moccqhcihgas
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
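The `FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.` wrapper in the trace above indicates that the client kept re-polling the job until a retry budget ran out, then surfaced the last underlying failure. A minimal Python analogue of that retry-with-delay pattern (the names and signature are illustrative, not Flink's API):

```python
import time

class RetryException(Exception):
    """Raised once the retry budget is exhausted, wrapping the last failure."""

def retry_operation_with_delay(operation, retries, delay_s=0.0):
    """Call `operation` up to `retries + 1` times, sleeping between attempts.

    Mirrors the shape of a retry-with-delay helper: the final failure is
    chained into a RetryException once all retries are used up, so callers
    see both the exhaustion and the underlying cause.
    """
    last_error = None
    for attempt in range(retries + 1):
        try:
            return operation()
        except Exception as e:  # broad on purpose: this is a sketch
            last_error = e
            if attempt < retries:
                time.sleep(delay_s)
    raise RetryException(
        "Could not complete the operation. "
        "Number of retries has been exhausted.") from last_error

# Example: a poll that never succeeds, e.g. while the job fails to initialize.
attempts = []
def always_failing_poll():
    attempts.append(1)
    raise ValueError("Response was neither of the expected type nor an error.")

try:
    retry_operation_with_delay(always_failing_poll, retries=3)
except RetryException as e:
    print(len(attempts), "attempts;", e)
```

This is why the same parse error appears only once at the bottom of the cause chain: every retry hit it, and only the final exhaustion is reported.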
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #709
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/709/display/redirect?page=changes>
Changes:
[noreply] Add TFX support in pydoc (#23960)
[noreply] Bump cloud.google.com/go/bigtable from 1.17.0 to 1.18.0 in /sdks
[noreply] disable (#24121)
[noreply] Implement PubsubRowToMessage transform (#23897)
[noreply] upgrade testcontainer dependency (#24123)
[noreply] More cleanup containers (#24105)
[noreply] Bump github.com/aws/aws-sdk-go-v2/service/s3 in /sdks (#24112)
[noreply] Bump google.golang.org/api from 0.102.0 to 0.103.0 in /sdks (#24049)
[noreply] Update staging of Python wheels (#24114)
[noreply] Add a ValidatesContainer integration test for use_sibling_sdk_workers
[noreply] Fix checkArgument format string in TestStream (#24134)
------------------------------------------
[...truncated 33.92 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/12 08:44:10 Using specified **** binary: 'linux_amd64/combine'
2022/11/12 08:44:11 Prepared job with id: load-tests-go-flink-batch-combine-1-1112065321_4a7d0007-00f2-4b33-9acd-65d264d05d6d and staging token: load-tests-go-flink-batch-combine-1-1112065321_4a7d0007-00f2-4b33-9acd-65d264d05d6d
2022/11/12 08:44:15 Staged binary artifact with token:
2022/11/12 08:44:16 Submitted job: load0tests0go0flink0batch0combine0101112065321-root-1112084415-5203f64f_fb7ec72d-327e-4f6f-959b-e9927c505959
2022/11/12 08:44:16 Job state: STOPPED
2022/11/12 08:44:16 Job state: STARTING
2022/11/12 08:44:16 Job state: RUNNING
2022/11/12 08:45:25 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/12 08:45:25 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/12 08:45:25 Job state: FAILED
2022/11/12 08:45:25 Failed to execute job: job load0tests0go0flink0batch0combine0101112065321-root-1112084415-5203f64f_fb7ec72d-327e-4f6f-959b-e9927c505959 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101112065321-root-1112084415-5203f64f_fb7ec72d-327e-4f6f-959b-e9927c505959 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1634d28, 0xc00004a0d0}, {0x149646d?, 0x1fb3680?}, {0xc0003f5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 47s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/w56dmkbjzeexm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #708
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/708/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Compute element counts for all PCollections.
[Robert Bradshaw] Add the ability to schedule splits on the ULR via a pipeline option.
[Robert Bradshaw] Add the a Reshuffle operation and use it in Create.
[Robert Bradshaw] Add dynamic splitting support to the worker.
[noreply] Update style
[Robert Bradshaw] Clarifying comments.
[Robert Bradshaw] Make mypy happy.
[Robert Bradshaw] Reduce flakiness of time-based split manager test.
[noreply] Fix FhirIO javadoc format broken (#24072)
[noreply] Bump github.com/aws/aws-sdk-go-v2/service/s3 in /sdks (#24077)
[noreply] [BEAM-12792] Install pipline dependencies to temporary venv (#16658)
[noreply] [Python]Set pickle library at the Pipeline creation stage (#24069)
[noreply] Improving stale container cleanup script (#24040)
[noreply] Add random string at the end of BigQuery query job name to make it
[noreply] [Playground] update snippet by persistence_key (#24056)
[noreply] [Tour Of Beam] handle CORS pre-flight requests (#24083)
[noreply] Num failed inferences (#23830)
[noreply] Bump github.com/aws/aws-sdk-go-v2/config from 1.5.0 to 1.17.10 in /sdks
[noreply] Add blog post on new ML resources (#24071)
[noreply] fixing linter error (#24104)
[noreply] Support using BigQueryIO Storage Read API with SchemaTransforms (#23827)
[noreply] Wire SamzaPipelineOptions to Exeption listener interface (#24109)
[noreply] Remove TheNeuralBit from the pool of Python reviewers (#24108)
------------------------------------------
[...truncated 33.99 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/11 08:43:56 Using specified **** binary: 'linux_amd64/combine'
2022/11/11 08:43:57 Prepared job with id: load-tests-go-flink-batch-combine-1-1111065334_f3e30cf2-9b3e-4132-aa9e-d7596f6a9342 and staging token: load-tests-go-flink-batch-combine-1-1111065334_f3e30cf2-9b3e-4132-aa9e-d7596f6a9342
2022/11/11 08:44:01 Staged binary artifact with token:
2022/11/11 08:44:03 Submitted job: load0tests0go0flink0batch0combine0101111065334-root-1111084402-73f9b77a_66754eff-c45b-44cf-8a99-2621ad1d86b0
2022/11/11 08:44:03 Job state: STOPPED
2022/11/11 08:44:03 Job state: STARTING
2022/11/11 08:44:03 Job state: RUNNING
2022/11/11 08:45:11 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/11 08:45:11 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/11 08:45:11 Job state: FAILED
2022/11/11 08:45:11 Failed to execute job: job load0tests0go0flink0batch0combine0101111065334-root-1111084402-73f9b77a_66754eff-c45b-44cf-8a99-2621ad1d86b0 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101111065334-root-1111084402-73f9b77a_66754eff-c45b-44cf-8a99-2621ad1d86b0 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16339a8, 0xc00004a0d0}, {0x149532d?, 0x1fb15c0?}, {0xc000561e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/dc5wbgnbtcyss
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #707
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/707/display/redirect?page=changes>
Changes:
[Moritz Mack] [Spark Dataset runner] Enable projection pushdown for Spark dataset
[noreply] Fix dependency mismatch in Playground Java runner (#24059)
[noreply] added comments for tensorflow notebook (#23726)
[noreply] Convert initialisms to all caps (#24061)
[noreply] skip output coder field in exp request (#24066)
[noreply] test: add more tests to throughput estimator (#23915)
[noreply] Remove a duplicate label (#24043)
[noreply] Update datastore_wordcount.py (#23724)
------------------------------------------
[...truncated 33.92 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/10 08:44:01 Using specified **** binary: 'linux_amd64/combine'
2022/11/10 08:44:01 Prepared job with id: load-tests-go-flink-batch-combine-1-1110065326_86939c46-cf57-4c73-bf77-4032759a2353 and staging token: load-tests-go-flink-batch-combine-1-1110065326_86939c46-cf57-4c73-bf77-4032759a2353
2022/11/10 08:44:05 Staged binary artifact with token:
2022/11/10 08:44:06 Submitted job: load0tests0go0flink0batch0combine0101110065326-root-1110084405-cdb04f8b_72fa99c0-3fe5-4dc3-a375-f3a441680cae
2022/11/10 08:44:06 Job state: STOPPED
2022/11/10 08:44:06 Job state: STARTING
2022/11/10 08:44:06 Job state: RUNNING
2022/11/10 08:45:15 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/10 08:45:15 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/10 08:45:15 Job state: FAILED
2022/11/10 08:45:15 Failed to execute job: job load0tests0go0flink0batch0combine0101110065326-root-1110084405-cdb04f8b_72fa99c0-3fe5-4dc3-a375-f3a441680cae failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101110065326-root-1110084405-cdb04f8b_72fa99c0-3fe5-4dc3-a375-f3a441680cae failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16339a8, 0xc0001a8000}, {0x149532d?, 0x1fb15c0?}, {0xc0000e3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/w2ko2c5vwye4a
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #706
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/706/display/redirect?page=changes>
Changes:
[noreply] Update release notes. (#23986)
[noreply] [Go] Pipeline Resource Hints (#23990)
[noreply] [#21250] Trivial removal of loop over something that always has one
[noreply] Bump cloud.google.com/go/bigtable from 1.16.0 to 1.17.0 in /sdks
[noreply] Editorial review of the ML base API descriptions (#24026)
[noreply] Update my Twitter handle (#23653)
[noreply] Retroactively announce Batched DoFn support in 2.42.0 Blog (#24011)
[noreply] Bump cloud.google.com/go/storage from 1.27.0 to 1.28.0 in /sdks (#24028)
[noreply] [Go] Add pipeline resource hints to CHANGES.md (#24036)
[noreply] Handle Avro schema generation for logical data types in
[noreply] [Go SDK] S3 implementation of the Beam filesystem (#23992)
------------------------------------------
[...truncated 34.21 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/09 08:44:43 Using specified **** binary: 'linux_amd64/combine'
2022/11/09 08:44:43 Prepared job with id: load-tests-go-flink-batch-combine-1-1109065411_5c83334d-c929-43a9-9b6a-ca74506effb6 and staging token: load-tests-go-flink-batch-combine-1-1109065411_5c83334d-c929-43a9-9b6a-ca74506effb6
2022/11/09 08:44:48 Staged binary artifact with token:
2022/11/09 08:44:49 Submitted job: load0tests0go0flink0batch0combine0101109065411-root-1109084448-db46aa98_f102671d-f140-4999-9536-cb288cc29cf3
2022/11/09 08:44:49 Job state: STOPPED
2022/11/09 08:44:49 Job state: STARTING
2022/11/09 08:44:49 Job state: RUNNING
2022/11/09 08:45:59 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/09 08:45:59 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/09 08:45:59 Job state: FAILED
2022/11/09 08:45:59 Failed to execute job: job load0tests0go0flink0batch0combine0101109065411-root-1109084448-db46aa98_f102671d-f140-4999-9536-cb288cc29cf3 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101109065411-root-1109084448-db46aa98_f102671d-f140-4999-9536-cb288cc29cf3 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1633b88, 0xc0001a8000}, {0x14954cd?, 0x1fb15c0?}, {0xc0006e5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/pdss46uxc2k7a
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #705
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/705/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Better surfacing of Scala support via Scio.
[bulat.safiullin] [Website] change case-study-card width on mobile
[vitaly.terentyev] Add sparkreceiver:2 module.
[vitaly.terentyev] Fix sparkreceiver dependencies
[noreply] Print diff and scope to state path
[noreply] Correctly print diff and swallow empty commits for the moment
[noreply] Remove quiet flag on debug
[noreply] Use git diff instead of git diff-index to avoid file timestamp changes
[noreply] Make `documentation/io/connectors/` canonical (#23877)
[noreply] [Tour of Beam] Learning content for "Introduction" module (#23085)
[noreply] feat: implement bigtable io connector with write capabilities (#23411)
[noreply] Bump google.golang.org/api from 0.101.0 to 0.102.0 in /sdks (#23957)
[noreply] Enforce splitting invariants by ensuring split state is reset in the
[noreply] Add files then check cached diff to get untracked files
[noreply] Switch && for || to fix bug in #23889 resolution (#24017)
------------------------------------------
[...truncated 34.07 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/08 08:44:08 Using specified **** binary: 'linux_amd64/combine'
2022/11/08 08:44:08 Prepared job with id: load-tests-go-flink-batch-combine-1-1108065321_7226178e-485f-444e-9193-b192097c82f3 and staging token: load-tests-go-flink-batch-combine-1-1108065321_7226178e-485f-444e-9193-b192097c82f3
2022/11/08 08:44:12 Staged binary artifact with token:
2022/11/08 08:44:14 Submitted job: load0tests0go0flink0batch0combine0101108065321-root-1108084413-ce89a78e_30c0f64d-1eaf-4068-beb8-d5a0eb568487
2022/11/08 08:44:14 Job state: STOPPED
2022/11/08 08:44:14 Job state: STARTING
2022/11/08 08:44:14 Job state: RUNNING
2022/11/08 08:45:23 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/08 08:45:23 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/08 08:45:23 Job state: FAILED
2022/11/08 08:45:23 Failed to execute job: job load0tests0go0flink0batch0combine0101108065321-root-1108084413-ce89a78e_30c0f64d-1eaf-4068-beb8-d5a0eb568487 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101108065321-root-1108084413-ce89a78e_30c0f64d-1eaf-4068-beb8-d5a0eb568487 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bce8, 0xc0001a6000}, {0x148e389?, 0x1fa6280?}, {0xc0003c3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 41s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/baof4ey3euzec
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #704
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/704/display/redirect?page=changes>
Changes:
[noreply] Fix diff to stop repeated bot runs
[noreply] Fix pr bot - exec doesn't allow command chaining
[noreply] PR Bot - Dont throw error on return code 1
------------------------------------------
[...truncated 34.02 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/07 08:43:49 Using specified **** binary: 'linux_amd64/combine'
2022/11/07 08:43:50 Prepared job with id: load-tests-go-flink-batch-combine-1-1107065317_9344d394-c029-4461-a2b4-9bf59708b5ef and staging token: load-tests-go-flink-batch-combine-1-1107065317_9344d394-c029-4461-a2b4-9bf59708b5ef
2022/11/07 08:43:54 Staged binary artifact with token:
2022/11/07 08:43:55 Submitted job: load0tests0go0flink0batch0combine0101107065317-root-1107084354-faffd37a_d507d561-5cf3-4abf-95dc-91c9622c76ab
2022/11/07 08:43:55 Job state: STOPPED
2022/11/07 08:43:55 Job state: STARTING
2022/11/07 08:43:55 Job state: RUNNING
2022/11/07 08:45:04 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/07 08:45:04 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/07 08:45:04 Job state: FAILED
2022/11/07 08:45:04 Failed to execute job: job load0tests0go0flink0batch0combine0101107065317-root-1107084354-faffd37a_d507d561-5cf3-4abf-95dc-91c9622c76ab failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101107065317-root-1107084354-faffd37a_d507d561-5cf3-4abf-95dc-91c9622c76ab failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bce8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc000141e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/3ov4cf2zltazq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #703
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/703/display/redirect>
Changes:
------------------------------------------
[...truncated 34.04 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
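In the coder graph dumped above, the custom Go coder `c13` is wrapped by `c14`, a `beam:coder:length_prefix:v1`, which frames the component coder's bytes with a varint byte length. A minimal sketch of that framing, assuming the standard Beam wire format (varint length followed by the payload):

```go
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

// encodeLengthPrefixed frames a payload the way beam:coder:length_prefix:v1
// does: a varint-encoded byte length, then the component coder's bytes.
func encodeLengthPrefixed(payload []byte) []byte {
	var buf bytes.Buffer
	var tmp [binary.MaxVarintLen64]byte
	n := binary.PutUvarint(tmp[:], uint64(len(payload)))
	buf.Write(tmp[:n])
	buf.Write(payload)
	return buf.Bytes()
}

// decodeLengthPrefixed reads the varint length and returns the framed bytes.
func decodeLengthPrefixed(b []byte) ([]byte, error) {
	length, n := binary.Uvarint(b)
	if n <= 0 {
		return nil, fmt.Errorf("bad varint length prefix")
	}
	return b[n : n+int(length)], nil
}

func main() {
	framed := encodeLengthPrefixed([]byte("top.accum"))
	out, _ := decodeLengthPrefixed(framed)
	fmt.Printf("%s\n", out)
}
```

This framing is what lets a runner skip over elements whose component coder it cannot interpret, such as the opaque `beam:go:coder:custom:v1` payloads above.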
2022/11/06 08:44:01 Using specified **** binary: 'linux_amd64/combine'
2022/11/06 08:44:02 Prepared job with id: load-tests-go-flink-batch-combine-1-1106065313_832cd723-2e2c-43da-bbf0-08eeeafbf00e and staging token: load-tests-go-flink-batch-combine-1-1106065313_832cd723-2e2c-43da-bbf0-08eeeafbf00e
2022/11/06 08:44:06 Staged binary artifact with token:
2022/11/06 08:44:07 Submitted job: load0tests0go0flink0batch0combine0101106065313-root-1106084406-d31b374b_f67b5b36-5d67-4e29-8819-3cda0f7f5708
2022/11/06 08:44:07 Job state: STOPPED
2022/11/06 08:44:07 Job state: STARTING
2022/11/06 08:44:07 Job state: RUNNING
2022/11/06 08:45:16 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/06 08:45:16 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/06 08:45:17 Job state: FAILED
2022/11/06 08:45:17 Failed to execute job: job load0tests0go0flink0batch0combine0101106065313-root-1106084406-d31b374b_f67b5b36-5d67-4e29-8819-3cda0f7f5708 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101106065313-root-1106084406-d31b374b_f67b5b36-5d67-4e29-8819-3cda0f7f5708 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bce8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc0003c9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/276txc3d3hvts
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #702
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/702/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Move logging to its own module.
[Robert Bradshaw] Cleanup worker logging.
[Robert Bradshaw] Add basic counter setting and getting to the typescript SDK.
[Robert Bradshaw] Support metrics over the portability API.
[Robert Bradshaw] Add distribution metric type.
[Robert Bradshaw] old prettier change
[noreply] Improve Iterator error message (#23972)
[noreply] Update watermark during periodic sequence/impulse (#23507)
[noreply] TFX image classification example (#23456)
[noreply] Immediately truncate full restriction on drain of periodic impulse
[noreply] [Task]: PR Bot will push commits only if they are non-empty (#23937)
[noreply] Bump cloud.google.com/go/datastore from 1.8.0 to 1.9.0 in /sdks (#23916)
[Robert Bradshaw] Remove obsolete TODO.
[Robert Bradshaw] Only report counters that were actually used.
[noreply] Add custom inference fn suport to the sklearn model handlers (#23642)
[noreply] removed trailing whitespace (#23987)
[noreply] Beam starter projects blog post (#23964)
[noreply] Enable more portable-runner requiring tests. (#23970)
[noreply] Website add and update logos (#23899)
------------------------------------------
[...truncated 34.04 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/05 08:43:58 Using specified **** binary: 'linux_amd64/combine'
2022/11/05 08:43:59 Prepared job with id: load-tests-go-flink-batch-combine-1-1105065318_42d5939c-39ac-4214-b245-3a0f16478548 and staging token: load-tests-go-flink-batch-combine-1-1105065318_42d5939c-39ac-4214-b245-3a0f16478548
2022/11/05 08:44:03 Staged binary artifact with token:
2022/11/05 08:44:04 Submitted job: load0tests0go0flink0batch0combine0101105065318-root-1105084403-a3dc9de8_2bcb3135-d607-42b5-88f6-b69ecd62c7b1
2022/11/05 08:44:04 Job state: STOPPED
2022/11/05 08:44:04 Job state: STARTING
2022/11/05 08:44:04 Job state: RUNNING
2022/11/05 08:45:13 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/05 08:45:13 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/05 08:45:13 Job state: FAILED
2022/11/05 08:45:13 Failed to execute job: job load0tests0go0flink0batch0combine0101105065318-root-1105084403-a3dc9de8_2bcb3135-d607-42b5-88f6-b69ecd62c7b1 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101105065318-root-1105084403-a3dc9de8_2bcb3135-d607-42b5-88f6-b69ecd62c7b1 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bce8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc0003e7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/byrv5wfheowzg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
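The root cause visible in the trace above is the Flink REST client rejecting a JobDetailsInfo response in which "maxParallelism" is null: Jackson's FAIL_ON_NULL_FOR_PRIMITIVES feature (enabled here) refuses to map JSON null into a primitive `long`, so response parsing fails and the submission retries are exhausted. The sketch below mimics that strict-vs-lenient behaviour in plain Python rather than Jackson; the `parse_job_details` helper and the sample response body are illustrative, not part of Flink's API.

```python
import json

def parse_job_details(body: str, fail_on_null_for_primitives: bool = True) -> int:
    """Mimic the Jackson behaviour from the trace: a primitive `long`
    field cannot hold JSON null while the strict check is on."""
    data = json.loads(body)
    value = data.get("maxParallelism")
    if value is None:
        if fail_on_null_for_primitives:
            # Corresponds to MismatchedInputException in the log above.
            raise ValueError("Cannot map `null` into type `long` (maxParallelism)")
        value = 0  # the lenient fallback the exception message suggests
    return value

# A response whose maxParallelism is null, e.g. from a REST API
# version that does not populate the field.
body = '{"jid": "abc", "name": "combine", "maxParallelism": null}'
```

With the strict check on (the default seen in the log), parsing raises; with it off, the field degrades to a primitive default instead of failing the whole response.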
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #701
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/701/display/redirect?page=changes>
Changes:
[ahmedabualsaud] emit load job IDs as soon as they come up
[ahmedabualsaud] style fix
[Moritz Mack] [Spark dataset runner] Add direct translation of Combine.GroupedValues
[noreply] Concept guide on orchestrating Beam preprocessing (#23094)
[noreply] Initial draft of Batched DoFn user guide (#23909)
[noreply] WIP: Dataframe API ML preprocessing notebook (#22587)
[noreply] [Python] Added none check while accessing active_process_bundle (#23947)
[noreply] [Tour Of Beam] saving user code (#23938)
[noreply] Disable flaky fn_api_runner tests (#23971)
[noreply] Make BatchConverter inference errors more helpful (#23965)
------------------------------------------
[...truncated 33.97 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
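In the pipeline proto above, coder "c14" (`beam:coder:length_prefix:v1`) wraps the custom Go coder "c13": the wrapper prefixes each encoded element with its byte length as an unsigned varint, so a runner can skip or relay elements it cannot decode itself. A minimal sketch of that framing (function names are illustrative, not Beam's SDK API):

```python
def encode_varint(n: int) -> bytes:
    """Unsigned base-128 varint, least-significant group first,
    high bit set on all but the final byte."""
    out = bytearray()
    while True:
        b = n & 0x7F
        n >>= 7
        if n:
            out.append(b | 0x80)
        else:
            out.append(b)
            return bytes(out)

def length_prefix_encode(element_bytes: bytes) -> bytes:
    """What a length-prefix coder adds around an inner coder's output."""
    return encode_varint(len(element_bytes)) + element_bytes
```

For example, a 5-byte payload gains a single length byte, while payloads of 128 bytes or more need a multi-byte varint prefix.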
2022/11/04 08:43:47 Using specified **** binary: 'linux_amd64/combine'
2022/11/04 08:43:48 Prepared job with id: load-tests-go-flink-batch-combine-1-1104065324_3d54e258-8e14-4a16-aef5-8dcb41f49009 and staging token: load-tests-go-flink-batch-combine-1-1104065324_3d54e258-8e14-4a16-aef5-8dcb41f49009
2022/11/04 08:43:52 Staged binary artifact with token:
2022/11/04 08:43:53 Submitted job: load0tests0go0flink0batch0combine0101104065324-root-1104084352-ddfd5300_9777e7e0-2448-4c7c-8736-4ec58c9d3413
2022/11/04 08:43:53 Job state: STOPPED
2022/11/04 08:43:53 Job state: STARTING
2022/11/04 08:43:53 Job state: RUNNING
2022/11/04 08:45:02 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/04 08:45:02 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/04 08:45:02 Job state: FAILED
2022/11/04 08:45:02 Failed to execute job: job load0tests0go0flink0batch0combine0101104065324-root-1104084352-ddfd5300_9777e7e0-2448-4c7c-8736-4ec58c9d3413 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101104065324-root-1104084352-ddfd5300_9777e7e0-2448-4c7c-8736-4ec58c9d3413 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bce8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc000629e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/albud373alxgc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #700
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/700/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] add shuffle to logos
[Moritz Mack] Fix Spark 3 job-server jar path for Python test suites (closes #23935,
[noreply] Bump actions/setup-java from 3.5.1 to 3.6.0 (#23797)
[noreply] [CdapIO] Add integration tests for SparkReceiverIO (#23305)
[noreply] [Go] Ensure iterated and emitted types are registered. (#23890)
[chamikaramj] Updates Multi-language Java examples documentation
[noreply] [Python SDK] Re-enable PipelineOptionsTest.test_display_data (#23787)
[noreply] Unify PerformanceTest metric dashboard naming and series (#23914)
[noreply] Update REVIEWERS.yaml (#23955)
------------------------------------------
[...truncated 33.98 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/03 08:44:07 Using specified **** binary: 'linux_amd64/combine'
2022/11/03 08:44:08 Prepared job with id: load-tests-go-flink-batch-combine-1-1103065319_ae6b830c-c127-439b-9416-19572435c31f and staging token: load-tests-go-flink-batch-combine-1-1103065319_ae6b830c-c127-439b-9416-19572435c31f
2022/11/03 08:44:12 Staged binary artifact with token:
2022/11/03 08:44:13 Submitted job: load0tests0go0flink0batch0combine0101103065319-root-1103084412-2f9440af_ece1ed19-6bcc-4689-9647-a82e210b95a7
2022/11/03 08:44:13 Job state: STOPPED
2022/11/03 08:44:13 Job state: STARTING
2022/11/03 08:44:13 Job state: RUNNING
2022/11/03 08:45:22 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/03 08:45:22 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/03 08:45:22 Job state: FAILED
2022/11/03 08:45:22 Failed to execute job: job load0tests0go0flink0batch0combine0101103065319-root-1103084412-2f9440af_ece1ed19-6bcc-4689-9647-a82e210b95a7 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101103065319-root-1103084412-2f9440af_ece1ed19-6bcc-4689-9647-a82e210b95a7 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bce8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc0005f5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
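[Editor's note] The root cause threading through the retry chain above is Jackson's strict primitive handling: the Flink REST response carries `"maxParallelism": null`, and with FAIL_ON_NULL_FOR_PRIMITIVES enabled Jackson refuses to map `null` into the primitive `long` field of JobDetailsInfo. Below is a minimal Python analogy of that failure mode (not Beam or Flink code; the function and its names are hypothetical):

```python
# Analogy of Jackson's FAIL_ON_NULL_FOR_PRIMITIVES behavior on
# JobDetailsInfo.maxParallelism: a JSON null cannot fill a primitive
# field unless the lenient mode substitutes the primitive default.
import json


def parse_max_parallelism(payload: str, allow_null_primitives: bool = False) -> int:
    """Return maxParallelism, raising if it is null and nulls are disallowed."""
    data = json.loads(payload)
    value = data.get("maxParallelism")
    if value is None:
        if not allow_null_primitives:
            # Strict mode: Jackson raises MismatchedInputException here.
            raise ValueError("Cannot map `null` into type `long`: maxParallelism")
        # Lenient mode: Jackson substitutes the primitive default (0 for long).
        return 0
    return int(value)
```

In the builds logged here the Flink REST client is in the strict mode, so every poll of the job-details endpoint fails the same way until the retry budget is exhausted.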
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 47s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uenlyl2fwps7i
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #699
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/699/display/redirect?page=changes>
Changes:
[noreply] Remove Dataflow Portability test suite from mass_comment.py
[noreply] Add jupyter notebook for using RunInference with sklearn, pytorch and
[noreply] Add WriteParquetBatched (#23030)
[noreply] Validate if user exists for author (#23761)
[noreply] Add notebook for doing remote inference in Beam (#23887)
[noreply] Fix python examples tests not running in Dataflow (#23546)
[noreply] Add support for converting to/from pyarrow Arrays (#23894)
------------------------------------------
[...truncated 33.97 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/02 08:43:41 Using specified **** binary: 'linux_amd64/combine'
2022/11/02 08:43:42 Prepared job with id: load-tests-go-flink-batch-combine-1-1102065315_7109ae9c-ab98-420f-9cd8-9ca42b645138 and staging token: load-tests-go-flink-batch-combine-1-1102065315_7109ae9c-ab98-420f-9cd8-9ca42b645138
2022/11/02 08:43:48 Staged binary artifact with token:
2022/11/02 08:43:49 Submitted job: load0tests0go0flink0batch0combine0101102065315-root-1102084348-a0079baa_5f8d853d-4371-4a33-8c3b-efc635277614
2022/11/02 08:43:49 Job state: STOPPED
2022/11/02 08:43:49 Job state: STARTING
2022/11/02 08:43:49 Job state: RUNNING
2022/11/02 08:44:58 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/02 08:44:58 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/02 08:44:59 Job state: FAILED
2022/11/02 08:44:59 Failed to execute job: job load0tests0go0flink0batch0combine0101102065315-root-1102084348-a0079baa_5f8d853d-4371-4a33-8c3b-efc635277614 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101102065315-root-1102084348-a0079baa_5f8d853d-4371-4a33-8c3b-efc635277614 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00012e000}, {0x148e389?, 0x1fa6280?}, {0xc000237e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
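[Editor's note] The FutureUtils$RetryException above ("Number of retries has been exhausted") is the generic retry-until-exhausted pattern: each poll of the job status hits the same deserialization error, so retrying cannot succeed. A sketch of that pattern (not Flink's implementation; names are illustrative):

```python
# Retry-until-exhausted sketch: when every attempt fails identically
# (as with the deterministic parse error above), the retry budget runs
# out and the last failure is surfaced as the cause.
def retry_operation(operation, max_retries: int):
    last_error = None
    for _ in range(max_retries + 1):  # initial attempt plus retries
        try:
            return operation()
        except Exception as exc:
            last_error = exc  # each attempt hits the same failure
    raise RuntimeError(
        "Could not complete the operation. Number of retries has been exhausted."
    ) from last_error
```

Retries only help with transient faults; a deterministic error such as the null-to-`long` mapping failure recurs on every attempt.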
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4xlu3goxqmk5u
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #698
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/698/display/redirect?page=changes>
Changes:
[yathu] Fix Beam Sql does not support CHAR, VARCHAR, BINARY, VARBINARY
[noreply] Add brief descriptions about end-to-end ML Pipelines (#23880)
[yathu] Remove debug leftover
[noreply] Disable `optimizeOuterThis` when building with JDK > 8 (#23902)
[noreply] [Playground] [Backend] Update playground cache component to increase
[noreply] Upgrade Akvelon editor (#23415) (#23900)
[noreply] [Website] update additional case studies layout and scss (#23555)
[noreply] [Website] add shuffle to logos (#23847)
[noreply] Clean-up DatastoreV1.java (#23892)
[noreply] Add LogElements as a Beam PTransform (#23879)
[noreply] Fix incorrect object size calculation in StateCache (#23000) (#23886)
------------------------------------------
[...truncated 34.00 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/11/01 08:43:49 Using specified **** binary: 'linux_amd64/combine'
2022/11/01 08:43:49 Prepared job with id: load-tests-go-flink-batch-combine-1-1101065319_98f2698e-5ddd-49cd-ad9f-26a6794b17d4 and staging token: load-tests-go-flink-batch-combine-1-1101065319_98f2698e-5ddd-49cd-ad9f-26a6794b17d4
2022/11/01 08:43:54 Staged binary artifact with token:
2022/11/01 08:43:55 Submitted job: load0tests0go0flink0batch0combine0101101065319-root-1101084354-8955d2ea_30aa1941-2e6c-480c-b0ba-c730f68f6e81
2022/11/01 08:43:55 Job state: STOPPED
2022/11/01 08:43:55 Job state: STARTING
2022/11/01 08:43:55 Job state: RUNNING
2022/11/01 08:45:04 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/11/01 08:45:04 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/11/01 08:45:04 Job state: FAILED
2022/11/01 08:45:04 Failed to execute job: job load0tests0go0flink0batch0combine0101101065319-root-1101084354-8955d2ea_30aa1941-2e6c-480c-b0ba-c730f68f6e81 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101101065319-root-1101084354-8955d2ea_30aa1941-2e6c-480c-b0ba-c730f68f6e81 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc00044fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4pj2qdkgmmuu4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #697
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/697/display/redirect?page=changes>
Changes:
[noreply] Fixing branch verification for Run RC Validation and Verify Release
[noreply] Fix link in `basics` (#23399)
------------------------------------------
[...truncated 33.87 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/31 08:43:54 Using specified worker binary: 'linux_amd64/combine'
2022/10/31 08:43:54 Prepared job with id: load-tests-go-flink-batch-combine-1-1031065318_df3faf34-7a8e-49e7-9efa-87c147bd543d and staging token: load-tests-go-flink-batch-combine-1-1031065318_df3faf34-7a8e-49e7-9efa-87c147bd543d
2022/10/31 08:43:59 Staged binary artifact with token:
2022/10/31 08:44:00 Submitted job: load0tests0go0flink0batch0combine0101031065318-root-1031084359-d475cb46_2eaf235f-acd8-4e02-b7b4-e15ff72bccc6
2022/10/31 08:44:00 Job state: STOPPED
2022/10/31 08:44:00 Job state: STARTING
2022/10/31 08:44:00 Job state: RUNNING
2022/10/31 08:45:09 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/31 08:45:09 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/31 08:45:10 Job state: FAILED
2022/10/31 08:45:10 Failed to execute job: job load0tests0go0flink0batch0combine0101031065318-root-1031084359-d475cb46_2eaf235f-acd8-4e02-b7b4-e15ff72bccc6 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101031065318-root-1031084359-d475cb46_2eaf235f-acd8-4e02-b7b4-e15ff72bccc6 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc0005d9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/dnl5vqdjtuxcw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #696
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/696/display/redirect>
Changes:
------------------------------------------
[...truncated 34.06 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/30 08:43:59 Using specified worker binary: 'linux_amd64/combine'
2022/10/30 08:43:59 Prepared job with id: load-tests-go-flink-batch-combine-1-1030065318_e0b96bd9-2750-4cf5-8f07-bcd0103c8e3a and staging token: load-tests-go-flink-batch-combine-1-1030065318_e0b96bd9-2750-4cf5-8f07-bcd0103c8e3a
2022/10/30 08:44:04 Staged binary artifact with token:
2022/10/30 08:44:05 Submitted job: load0tests0go0flink0batch0combine0101030065318-root-1030084404-2fc68ca2_181c3807-e29a-4a9b-b3f5-325633d4f72d
2022/10/30 08:44:05 Job state: STOPPED
2022/10/30 08:44:05 Job state: STARTING
2022/10/30 08:44:05 Job state: RUNNING
2022/10/30 08:45:14 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/30 08:45:14 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/30 08:45:14 Job state: FAILED
2022/10/30 08:45:14 Failed to execute job: job load0tests0go0flink0batch0combine0101030065318-root-1030084404-2fc68ca2_181c3807-e29a-4a9b-b3f5-325633d4f72d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101030065318-root-1030084404-2fc68ca2_181c3807-e29a-4a9b-b3f5-325633d4f72d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc000639e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 33s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vytv3qoljgdlm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
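The failure mode repeated in each report in this thread is the same: the Go client submits the job, polls its state (STOPPED → STARTING → RUNNING), and once the runner reports FAILED it calls `log.Fatalf`, which produces the `panic: Failed to execute job` seen above. A minimal sketch of that terminal-state loop (type and function names here are illustrative, not the Beam Go SDK's actual API):

```go
package main

import (
	"fmt"
)

// JobState mirrors the states seen in the log
// (STOPPED -> STARTING -> RUNNING -> FAILED). Names are illustrative.
type JobState string

const (
	StateStopped  JobState = "STOPPED"
	StateStarting JobState = "STARTING"
	StateRunning  JobState = "RUNNING"
	StateFailed   JobState = "FAILED"
	StateDone     JobState = "DONE"
)

// waitForTerminal consumes state updates and returns an error on FAILED,
// which a caller like the load test's main() would turn into the
// "Failed to execute job" fatal log entry.
func waitForTerminal(states <-chan JobState, jobID string) error {
	for s := range states {
		fmt.Println("Job state:", s)
		switch s {
		case StateFailed:
			return fmt.Errorf("job %s failed", jobID)
		case StateDone:
			return nil
		}
	}
	return fmt.Errorf("job %s: state stream closed before a terminal state", jobID)
}

func main() {
	ch := make(chan JobState, 3)
	ch <- StateStarting
	ch <- StateRunning
	ch <- StateFailed
	close(ch)
	if err := waitForTerminal(ch, "load0tests0go0flink0batch0combine"); err != nil {
		fmt.Println("Failed to execute job:", err)
	}
}
```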
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #695
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/695/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] update calendar section mobile classes #22694
[Robert Bradshaw] More bigquery native sink cleanup.
[noreply] Fix BigQueryIO Performance Test Streaming (#23857)
[noreply] adding examples in schema transforms section of programming guide for
------------------------------------------
[...truncated 34.06 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/29 08:44:03 Using specified worker binary: 'linux_amd64/combine'
2022/10/29 08:44:03 Prepared job with id: load-tests-go-flink-batch-combine-1-1029065317_ca74164d-c6bd-43f0-8205-45cac61741a4 and staging token: load-tests-go-flink-batch-combine-1-1029065317_ca74164d-c6bd-43f0-8205-45cac61741a4
2022/10/29 08:44:07 Staged binary artifact with token:
2022/10/29 08:44:08 Submitted job: load0tests0go0flink0batch0combine0101029065317-root-1029084407-6ecd41ba_378613f5-5c62-40cc-b97b-f869e363f0dc
2022/10/29 08:44:08 Job state: STOPPED
2022/10/29 08:44:08 Job state: STARTING
2022/10/29 08:44:08 Job state: RUNNING
2022/10/29 08:45:17 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/29 08:45:17 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/29 08:45:17 Job state: FAILED
2022/10/29 08:45:17 Failed to execute job: job load0tests0go0flink0batch0combine0101029065317-root-1029084407-6ecd41ba_378613f5-5c62-40cc-b97b-f869e363f0dc failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101029065317-root-1029084407-6ecd41ba_378613f5-5c62-40cc-b97b-f869e363f0dc failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc0002fbe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/nkmuuj6vhtefs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #694
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/694/display/redirect?page=changes>
Changes:
[Heejong Lee] [BEAM-23836] Updating documentation for cross-language Java pipelines
[Heejong Lee] update
[Heejong Lee] update
[Heejong Lee] update
[Alexey Romanenko] [23832] Update CHANGES.md
[noreply] [Tour Of Beam] User authorization part 1 (#23782)
[noreply] [BEAM-23815] Fix Neo4j tests. (#23862)
[noreply] Add `arrow_type_compatibility` with `pyarrow.Table` to Beam Row
[noreply] Reduce log spam of Py37PostCommit (#23829)
[noreply] Actually use the DatsetService that will be auto-closed (#23873)
[noreply] Migrate BINARY, VARBINARY, CHAR, VARCHAR jdbc logical types to portable
[noreply] [BEAM-12164] Feat: Added SpannerChangeStreamIT to Cloud Spanner Change
[noreply] Use --release 8 for builds targeting Java 8 (#23771)
------------------------------------------
[...truncated 34.07 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/28 08:44:03 Using specified worker binary: 'linux_amd64/combine'
2022/10/28 08:44:04 Prepared job with id: load-tests-go-flink-batch-combine-1-1028065327_803430dc-f6c8-4b16-9bdd-a275e9f6ae29 and staging token: load-tests-go-flink-batch-combine-1-1028065327_803430dc-f6c8-4b16-9bdd-a275e9f6ae29
2022/10/28 08:44:09 Staged binary artifact with token:
2022/10/28 08:44:10 Submitted job: load0tests0go0flink0batch0combine0101028065327-root-1028084409-ef3dc928_fe27ef2e-5aea-4a48-a5b5-949fb45b5dec
2022/10/28 08:44:10 Job state: STOPPED
2022/10/28 08:44:10 Job state: STARTING
2022/10/28 08:44:10 Job state: RUNNING
2022/10/28 08:45:19 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/28 08:45:19 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/28 08:45:19 Job state: FAILED
2022/10/28 08:45:19 Failed to execute job: job load0tests0go0flink0batch0combine0101028065327-root-1028084409-ef3dc928_fe27ef2e-5aea-4a48-a5b5-949fb45b5dec failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101028065327-root-1028084409-ef3dc928_fe27ef2e-5aea-4a48-a5b5-949fb45b5dec failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00004a0d0}, {0x148e389?, 0x1fa6280?}, {0xc000645e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/tpw64lekt6eqk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #693
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/693/display/redirect?page=changes>
Changes:
[Moritz Mack] Bump dropwizard metrics-core for Spark 3 runner to match the version
[Moritz Mack] Remove obsolete code from Spark 3 runner.
[noreply] Fixing Get Started header link (#23490)
[noreply] Bump cloud.google.com/go/bigquery from 1.42.0 to 1.43.0 in /sdks
[noreply] Bump google.golang.org/api from 0.100.0 to 0.101.0 in /sdks (#23842)
[noreply] Update ReadDataFromKinesis URN to registered URN (fixes #23693) (#23849)
[noreply] [23832] Remove ParquetIO.withSplit (closes #23832) (#23833)
[noreply] Bump github.com/spf13/cobra from 1.6.0 to 1.6.1 in /sdks (#23822)
[noreply] [Go SDK] Add tests to the metrics package (#23769)
[noreply] Bump cloud.google.com/go/pubsub from 1.25.1 to 1.26.0 in /sdks (#23823)
[noreply] Updated documentation to point to notebooks instead of having samples
[noreply] Post 2.42.0 Updates to release guide (#23672)
[noreply] Add Go usage instructions to download page. (#23698)
[noreply] Deactivate Dask Runner code coverage tests as workaround (#23841)
[noreply] Use Akvelon editor (#23415) (#23825)
------------------------------------------
[...truncated 33.94 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/27 08:43:51 Using specified **** binary: 'linux_amd64/combine'
2022/10/27 08:43:51 Prepared job with id: load-tests-go-flink-batch-combine-1-1027065316_b71cd463-6b87-4698-9ccf-f05ba519bfb2 and staging token: load-tests-go-flink-batch-combine-1-1027065316_b71cd463-6b87-4698-9ccf-f05ba519bfb2
2022/10/27 08:43:56 Staged binary artifact with token:
2022/10/27 08:43:57 Submitted job: load0tests0go0flink0batch0combine0101027065316-root-1027084356-7174e89_8ee31fc5-d308-4429-b456-c728b591aecd
2022/10/27 08:43:58 Job state: STOPPED
2022/10/27 08:43:58 Job state: STARTING
2022/10/27 08:43:58 Job state: RUNNING
2022/10/27 08:45:07 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/27 08:45:07 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/27 08:45:07 Job state: FAILED
2022/10/27 08:45:07 Failed to execute job: job load0tests0go0flink0batch0combine0101027065316-root-1027084356-7174e89_8ee31fc5-d308-4429-b456-c728b591aecd failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101027065316-root-1027084356-7174e89_8ee31fc5-d308-4429-b456-c728b591aecd failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00012e000}, {0x148e389?, 0x1fa6280?}, {0xc0000f7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 43s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wpazdffljwo6s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #692
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/692/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Enable checkerframework by default
[noreply] granting ruslan shamunov triage rights (#23806)
[noreply] Bump google.golang.org/api from 0.99.0 to 0.100.0 in /sdks (#23718)
[noreply] Initial DaskRunner for Beam (#22421)
[noreply] [Website] update PULL_REQUEST_TEMPLATE.md (#23576)
[noreply] [Website] change width of the additional case studies cards (#23824)
[chamikaramj] Adds a dependency to Python Multi-language library to the GCP Bom
[noreply] Support keyed executors in Samza Runner to process bundles for stateful
------------------------------------------
[...truncated 33.98 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/26 08:44:01 Using specified **** binary: 'linux_amd64/combine'
2022/10/26 08:44:01 Prepared job with id: load-tests-go-flink-batch-combine-1-1026065317_95fd2516-706a-48fd-b10e-c949fc70bd24 and staging token: load-tests-go-flink-batch-combine-1-1026065317_95fd2516-706a-48fd-b10e-c949fc70bd24
2022/10/26 08:44:06 Staged binary artifact with token:
2022/10/26 08:44:07 Submitted job: load0tests0go0flink0batch0combine0101026065317-root-1026084406-f43a8899_0945b0c3-9874-44ee-b38c-a803fc33a8a7
2022/10/26 08:44:07 Job state: STOPPED
2022/10/26 08:44:07 Job state: STARTING
2022/10/26 08:44:07 Job state: RUNNING
2022/10/26 08:45:16 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/26 08:45:16 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/26 08:45:16 Job state: FAILED
2022/10/26 08:45:16 Failed to execute job: job load0tests0go0flink0batch0combine0101026065317-root-1026084406-f43a8899_0945b0c3-9874-44ee-b38c-a803fc33a8a7 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101026065317-root-1026084406-f43a8899_0945b0c3-9874-44ee-b38c-a803fc33a8a7 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162bcc8, 0xc00004a0d0}, {0x148e389?, 0x1fa62a0?}, {0xc000405e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/xo5pauzpzy5oi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #691
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/691/display/redirect?page=changes>
Changes:
[Andrew Pilloud] Publish Python nexmark metrics to influxdb
[Andrew Pilloud] Exclude nexmark from codecov, it has no tests
[noreply] Updated ipywidgets
[yathu] Bump dataflow java fn container version to beam-master-20221022
[Moritz Mack] Update remaining pointers to Spark runner to Spark 3 module (addresses
[noreply] Ignoring BigQuery partitions with empty files (#23710)
[noreply] Benchmarking RunInference Example (#23554)
[Kiley Sok] Increase timeout for test pipelines
[noreply] Bump Dataflow python containers to 20221021 (#23807)
[noreply] Allow MoreFutures.allAsList/allAsListWithExceptions to have the passed
------------------------------------------
[...truncated 33.93 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/25 08:44:01 Using specified **** binary: 'linux_amd64/combine'
2022/10/25 08:44:01 Prepared job with id: load-tests-go-flink-batch-combine-1-1025065321_2d628388-aede-40d9-b3d4-b6fd49bb8c31 and staging token: load-tests-go-flink-batch-combine-1-1025065321_2d628388-aede-40d9-b3d4-b6fd49bb8c31
2022/10/25 08:44:06 Staged binary artifact with token:
2022/10/25 08:44:07 Submitted job: load0tests0go0flink0batch0combine0101025065321-root-1025084406-1f4647d4_a7b46865-ba9c-4e93-b26a-b4f5e9062756
2022/10/25 08:44:07 Job state: STOPPED
2022/10/25 08:44:07 Job state: STARTING
2022/10/25 08:44:07 Job state: RUNNING
2022/10/25 08:45:17 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/25 08:45:17 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/25 08:45:17 Job state: FAILED
2022/10/25 08:45:17 Failed to execute job: job load0tests0go0flink0batch0combine0101025065321-root-1025084406-1f4647d4_a7b46865-ba9c-4e93-b26a-b4f5e9062756 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101025065321-root-1025084406-1f4647d4_a7b46865-ba9c-4e93-b26a-b4f5e9062756 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162ac08, 0xc00004a0d0}, {0x148d308?, 0x1fa5240?}, {0xc00077fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/2qjvb6iojrqza
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #690
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/690/display/redirect?page=changes>
Changes:
[noreply] Remove unnecessary dependencies from jpms test (#23775)
[noreply] Use Spark 3 job-server as default Spark job-server for PortableRunner
[noreply] Support usage of custom profileName with AWS ProfileCredentialsProvider
[noreply] Migrate examples and maven-archetypes (including Java Quickstart) to
------------------------------------------
[...truncated 33.93 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/24 08:43:58 Using specified **** binary: 'linux_amd64/combine'
2022/10/24 08:43:58 Prepared job with id: load-tests-go-flink-batch-combine-1-1024065314_95a02c97-054b-49bc-8a42-9cd3ebb91d3f and staging token: load-tests-go-flink-batch-combine-1-1024065314_95a02c97-054b-49bc-8a42-9cd3ebb91d3f
2022/10/24 08:44:02 Staged binary artifact with token:
2022/10/24 08:44:04 Submitted job: load0tests0go0flink0batch0combine0101024065314-root-1024084403-4d2af0e9_c82f1ae0-4890-4e19-89b2-d7bac87bcb47
2022/10/24 08:44:04 Job state: STOPPED
2022/10/24 08:44:04 Job state: STARTING
2022/10/24 08:44:04 Job state: RUNNING
2022/10/24 08:45:13 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/24 08:45:13 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/24 08:45:13 Job state: FAILED
2022/10/24 08:45:13 Failed to execute job: job load0tests0go0flink0batch0combine0101024065314-root-1024084403-4d2af0e9_c82f1ae0-4890-4e19-89b2-d7bac87bcb47 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101024065314-root-1024084403-4d2af0e9_c82f1ae0-4890-4e19-89b2-d7bac87bcb47 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162ac08, 0xc00012e000}, {0x148d308?, 0x1fa5240?}, {0xc0006d9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/xudk3k2j4dkdc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #689
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/689/display/redirect?page=changes>
Changes:
[noreply] Merge pull request #23556: Forward failed storage-api row inserts to the
------------------------------------------
[...truncated 33.87 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/23 08:43:42 Using specified worker binary: 'linux_amd64/combine'
2022/10/23 08:43:43 Prepared job with id: load-tests-go-flink-batch-combine-1-1023065316_1f8d30ec-f68b-4175-94d4-913fe4b8072f and staging token: load-tests-go-flink-batch-combine-1-1023065316_1f8d30ec-f68b-4175-94d4-913fe4b8072f
2022/10/23 08:43:47 Staged binary artifact with token:
2022/10/23 08:43:48 Submitted job: load0tests0go0flink0batch0combine0101023065316-root-1023084347-301e6b1f_e2f18187-a1d1-451d-9e37-1a2cb83567c2
2022/10/23 08:43:48 Job state: STOPPED
2022/10/23 08:43:48 Job state: STARTING
2022/10/23 08:43:48 Job state: RUNNING
2022/10/23 08:44:58 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/23 08:44:58 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/23 08:44:58 Job state: FAILED
2022/10/23 08:44:58 Failed to execute job: job load0tests0go0flink0batch0combine0101023065316-root-1023084347-301e6b1f_e2f18187-a1d1-451d-9e37-1a2cb83567c2 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101023065316-root-1023084347-301e6b1f_e2f18187-a1d1-451d-9e37-1a2cb83567c2 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162ac08, 0xc000218000}, {0x148d308?, 0x1fa5240?}, {0xc0001e9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wuhujackrv5bu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #688
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/688/display/redirect?page=changes>
Changes:
[Moritz Mack] Remove obsolete sparkRunner task from hadoop-format: not triggered, no
[noreply] bugfix/wrong-notebook-linl (#23777)
[noreply] [CdapIO] Integration CdapIO with SparkReceiverIO (#22584)
[noreply] Avoid Circular imports related to bigquery_schema_tools (#23731)
[noreply] Use Flink 1.13 for load tests (#23767)
[Kenneth Knowles] Re-enable PubsubTableProviderIT.testSQLSelectsArrayAttributes
[noreply] Remove obsolete native text io translation. (#23549)
[noreply] Eliminate nullness errors from GenerateSequence (#23744)
[noreply] Add logos to case-studies "Also Used By" (#23781)
[noreply] Avoid pickling unstable reference to moved proto classes. (#23739)
[Robert Bradshaw] Unskip test_generated_class_pickle for cloudpickle.
[noreply] Allow local packages in requirements.txt dependency list. (#23684)
[noreply] Revert "Update BQIO to a single scheduled executor service reduce
[noreply] Updates Python test expansion service to use Cloud Pickle (#23786)
[noreply] Merge pull request #23795: Revert 23234: issue #23794
------------------------------------------
[...truncated 33.96 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/22 08:44:19 Using specified worker binary: 'linux_amd64/combine'
2022/10/22 08:44:20 Prepared job with id: load-tests-go-flink-batch-combine-1-1022065332_bac8b625-9a7a-4d01-8e4e-d7b7497ef8a2 and staging token: load-tests-go-flink-batch-combine-1-1022065332_bac8b625-9a7a-4d01-8e4e-d7b7497ef8a2
2022/10/22 08:44:25 Staged binary artifact with token:
2022/10/22 08:44:26 Submitted job: load0tests0go0flink0batch0combine0101022065332-root-1022084425-ce84794a_07fec5f3-f847-4736-baee-7063feb2a81e
2022/10/22 08:44:26 Job state: STOPPED
2022/10/22 08:44:26 Job state: STARTING
2022/10/22 08:44:26 Job state: RUNNING
2022/10/22 08:45:36 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/22 08:45:36 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/22 08:45:36 Job state: FAILED
2022/10/22 08:45:36 Failed to execute job: job load0tests0go0flink0batch0combine0101022065332-root-1022084425-ce84794a_07fec5f3-f847-4736-baee-7063feb2a81e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101022065332-root-1022084425-ce84794a_07fec5f3-f847-4736-baee-7063feb2a81e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162ac08, 0xc00012e000}, {0x148d308?, 0x1fa5240?}, {0xc0006ebe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/5tfryqit3zm5y
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #687
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/687/display/redirect?page=changes>
Changes:
[noreply] Issue#23599 Updated dataframe notebook
[noreply] Added a missing line break.
[Kenneth Knowles] Verify that secondary key coder is deterministic in SortValues
[Chamikara Madhusanka Jayalath] Updating Python dependencies for the 2.43.0 release
[thiagotnunes] tests: fixes SpannerIO unavailable retry test
[noreply] Remove yeandy from reviewers (#23753)
[noreply] Revert bigdataoss version upgrade (#23727)
[Chamikara Madhusanka Jayalath] Moving to 2.44.0-SNAPSHOT on master branch.
[noreply] Update the timeout in ValidatesContainer suite. (#23732)
[riteshghorse] fix lints
[noreply] Update google cloud vision >= 2.0.0 (#23755)
[noreply] Update GcsIO initialization to support converting input parameters to
[noreply] Adds instructions for running the Multi-language Java quickstart from
[noreply] Remove Spark2 from Java testing projects (addresses #23728) (#23749)
------------------------------------------
[...truncated 33.75 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/21 08:43:45 Using specified **** binary: 'linux_amd64/combine'
2022/10/21 08:43:45 Prepared job with id: load-tests-go-flink-batch-combine-1-1021065324_521a32aa-ed4d-42ee-b80d-1511955d4fd7 and staging token: load-tests-go-flink-batch-combine-1-1021065324_521a32aa-ed4d-42ee-b80d-1511955d4fd7
2022/10/21 08:43:50 Staged binary artifact with token:
2022/10/21 08:43:52 Submitted job: load0tests0go0flink0batch0combine0101021065324-root-1021084351-5f971342_60528140-7d74-4569-a5ec-99335a2f7dfe
2022/10/21 08:43:52 Job state: STOPPED
2022/10/21 08:43:52 Job state: STARTING
2022/10/21 08:43:52 Job state: RUNNING
2022/10/21 08:45:01 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:132)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:99)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/21 08:45:01 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/21 08:45:01 Job state: FAILED
2022/10/21 08:45:01 Failed to execute job: job load0tests0go0flink0batch0combine0101021065324-root-1021084351-5f971342_60528140-7d74-4569-a5ec-99335a2f7dfe failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101021065324-root-1021084351-5f971342_60528140-7d74-4569-a5ec-99335a2f7dfe failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162ac08, 0xc00004a0d0}, {0x148d308?, 0x1fa5240?}, {0xc000331e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/6hpmxke4tw2je
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #686
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/686/display/redirect?page=changes>
Changes:
[Moritz Mack] Keep Spark version in a single place only (BeamModulePlugin)
[noreply] [Playground] Examples CD (#23664)
[noreply] Update release instructions in Python 3.10 (#23702)
[noreply] Move Tensorflow Documentation (#23729)
[noreply] Bump golang.org/x/text from 0.3.7 to 0.4.0 in /sdks (#23686)
[noreply] Unit Content markdown styles (#23592) (#23662)
[noreply] Add reopen issue command (#23733)
[noreply] Add example of real time Anomaly Detection using RunInference (#23497)
[noreply] Support TIMESTAMP type in BigQueryIO with BEAM_ROW output type, and in
[noreply] Add PytorchBatchConverter (#23296)
[noreply] Pin version to grpcio in build-requirements.txt (#23735)
[noreply] Bump up python container versions. (#23716)
[noreply] Reduce log flood in Python PostCommit flink task (#23635)
[noreply] Speed up check on website links (#23737)
------------------------------------------
[...truncated 33.87 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/20 08:43:52 Using specified **** binary: 'linux_amd64/combine'
2022/10/20 08:43:52 Prepared job with id: load-tests-go-flink-batch-combine-1-1020082244_b30aaa0f-1db1-452b-a46d-14f644d74925 and staging token: load-tests-go-flink-batch-combine-1-1020082244_b30aaa0f-1db1-452b-a46d-14f644d74925
2022/10/20 08:43:57 Staged binary artifact with token:
2022/10/20 08:43:58 Submitted job: load0tests0go0flink0batch0combine0101020082244-root-1020084357-ff409238_862d035f-f15f-4d66-b1ab-98e063136929
2022/10/20 08:43:58 Job state: STOPPED
2022/10/20 08:43:58 Job state: STARTING
2022/10/20 08:43:58 Job state: RUNNING
2022/10/20 08:45:07 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/20 08:45:07 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/20 08:45:08 Job state: FAILED
2022/10/20 08:45:08 Failed to execute job: job load0tests0go0flink0batch0combine0101020082244-root-1020084357-ff409238_862d035f-f15f-4d66-b1ab-98e063136929 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101020082244-root-1020084357-ff409238_862d035f-f15f-4d66-b1ab-98e063136929 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162ac08, 0xc00004a0d0}, {0x148d308?, 0x1fa5240?}, {0xc000179e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/rm77hfuq6j33s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #685
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/685/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Remove numpy C API dep from public declarations.
[Andrew Pilloud] Migrate nexmark to common config for cron jobs
[Kiley Sok] beam-perf
[Kiley Sok] fix
[noreply] [GitHub Actions] - Run RC Validations Workflow (#23531)
[noreply] Add workflow to update milestone on issue close (#23629)
[noreply] add website page about data processing for ML (#23552)
[noreply] [Go SDK] Dataframe API wrapper (#23450)
[noreply] [Go SDK]: Adds Automated Python Expansion Service (#23582)
[noreply] Include CombineFn's in __all__ (#23685)
[noreply] Bump google.golang.org/grpc from 1.50.0 to 1.50.1 in /sdks (#23654)
[noreply] [Playground][Frontend] Tags filter for Examples Catalog (#22074)
[noreply] [Go SDK] Extract output coders in expandCrossLanguage (#23641)
[noreply] Python 3.10 support (#23587)
[noreply] Fixes #22192: Avoids nullpointer error. Preserves previous behavior.
[noreply] Deflaking tests for BQ row insertions. These tests were flaky due to
[noreply] Add java 11 home to jenkins test (#23708)
[noreply] enable automatic expansion service (#23699)
[noreply] add expansion service option (#23712)
[noreply] Downgrade container cryptography version to avoid yanked version
[noreply] Update portable runner test timeout (#23696)
[noreply] Merge pull request #23510: Vortex multiplexing streams
[noreply] Io jms fix ack message checkpoint (#22932)
------------------------------------------
[...truncated 33.75 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/19 08:43:48 Using specified **** binary: 'linux_amd64/combine'
2022/10/19 08:43:48 Prepared job with id: load-tests-go-flink-batch-combine-1-1019065345_db85f437-f5fa-40f3-b8b2-e612817fd9a3 and staging token: load-tests-go-flink-batch-combine-1-1019065345_db85f437-f5fa-40f3-b8b2-e612817fd9a3
2022/10/19 08:43:53 Staged binary artifact with token:
2022/10/19 08:43:54 Submitted job: load0tests0go0flink0batch0combine0101019065345-root-1019084353-c9f4c2e4_0b4acca2-e0d1-4190-8f77-a8b723cb3f4a
2022/10/19 08:43:54 Job state: STOPPED
2022/10/19 08:43:54 Job state: STARTING
2022/10/19 08:43:54 Job state: RUNNING
2022/10/19 08:45:03 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/19 08:45:03 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
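
[Editor's note] The MismatchedInputException above is Jackson refusing to map a JSON null into the primitive `long` field `maxParallelism` of `JobDetailsInfo`: a primitive has no null representation, so the mapping would require unboxing null. A minimal stdlib-only sketch of that same null-into-long failure (the variable name below mirrors the field in the trace but is purely illustrative, not Flink's actual deserialization code):

```java
public class NullIntoLong {
    public static void main(String[] args) {
        // A JSON null deserializes naturally into a wrapper type...
        Long maxParallelism = null;
        try {
            // ...but assigning it to a primitive long forces an unboxing
            // call on null, which throws NullPointerException at runtime.
            long primitive = maxParallelism;
            System.out.println(primitive);
        } catch (NullPointerException e) {
            System.out.println("cannot map null into long");
        }
    }
}
```

As the exception message itself notes, Jackson's `DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES` controls whether such a null surfaces as a `MismatchedInputException` or is coerced to the primitive default (0). The package names in the trace (`org.apache.flink.shaded.jackson2...`) show this Jackson instance is shaded inside the Flink runtime, so that feature is not something the load test client can toggle.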
2022/10/19 08:45:03 Job state: FAILED
2022/10/19 08:45:03 Failed to execute job: job load0tests0go0flink0batch0combine0101019065345-root-1019084353-c9f4c2e4_0b4acca2-e0d1-4190-8f77-a8b723cb3f4a failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101019065345-root-1019084353-c9f4c2e4_0b4acca2-e0d1-4190-8f77-a8b723cb3f4a failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x162ac08, 0xc00012e000}, {0x148d308?, 0x1fa5240?}, {0xc000457e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/nab6eocsioulu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #684
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/684/display/redirect?page=changes>
Changes:
[Moritz Mack] [Jenkins,Spark] Stop running Nexmark suite for deprecated Spark 2 runner
[noreply] [Playground] Examples CI (#23476)
[noreply] [Tour Of Beam] README update (#23318)
[noreply] Bump google.golang.org/api from 0.98.0 to 0.99.0 in /sdks (#23655)
[noreply] Fix beam_PerformanceTests_PubsubIOIT_Python_Streaming (#23607)
[Alexey Romanenko] [TPC-DS] Use "nonpartitioned" input for Jenkins jobs
[noreply] 2.42.0 Release Blog Post (#23406)
[noreply] Docs for state in go (#22965)
[noreply] Fix typo in 2.42.0 blog.
[noreply] Adjust 2.42.0 publishing time.
[noreply] Adds a Java RunInference example (#23619)
[noreply] Fixes #23627: Speed up website checks (#23673)
[noreply] Suppress a FloatingPointLiteralPrecision error (#23667)
[noreply] Improved test coverage and fix the implementation of Inject and CoGBK
[noreply] Fix python log_level_overrides cannot be used on flink and other
[noreply] Better error for disabling runner v2 with cross language pipelines.
[noreply] Update scopes to match the existing set and update test to clear
------------------------------------------
[...truncated 33.74 KB...]
2022/10/18 08:44:41 Using specified **** binary: 'linux_amd64/combine'
2022/10/18 08:44:41 Prepared job with id: load-tests-go-flink-batch-combine-1-1018080345_79af2df3-2ad5-423e-bb73-8bdfa962944e and staging token: load-tests-go-flink-batch-combine-1-1018080345_79af2df3-2ad5-423e-bb73-8bdfa962944e
2022/10/18 08:44:46 Staged binary artifact with token:
2022/10/18 08:44:47 Submitted job: load0tests0go0flink0batch0combine0101018080345-root-1018084446-8a4ab3aa_bf72b366-364a-4385-907f-5fe52a6596fc
2022/10/18 08:44:47 Job state: STOPPED
2022/10/18 08:44:47 Job state: STARTING
2022/10/18 08:44:47 Job state: RUNNING
2022/10/18 08:45:55 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/18 08:45:55 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/18 08:45:56 Job state: FAILED
2022/10/18 08:45:56 Failed to execute job: job load0tests0go0flink0batch0combine0101018080345-root-1018084446-8a4ab3aa_bf72b366-364a-4385-907f-5fe52a6596fc failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101018080345-root-1018084446-8a4ab3aa_bf72b366-364a-4385-907f-5fe52a6596fc failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1629448, 0xc0001a6000}, {0x148be28?, 0x1fa3200?}, {0xc000313e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 54s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/eizhx4pcbd2js
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #683
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/683/display/redirect?page=changes>
Changes:
[noreply] Blog post for Hop web in Google Cloud (#23652)
------------------------------------------
[...truncated 33.68 KB...]
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/17 08:43:28 Using specified **** binary: 'linux_amd64/combine'
2022/10/17 08:43:28 Prepared job with id: load-tests-go-flink-batch-combine-1-1017065320_3ce44bcd-1379-4c8e-a2e9-b20f71ef647f and staging token: load-tests-go-flink-batch-combine-1-1017065320_3ce44bcd-1379-4c8e-a2e9-b20f71ef647f
2022/10/17 08:43:33 Staged binary artifact with token:
2022/10/17 08:43:34 Submitted job: load0tests0go0flink0batch0combine0101017065320-root-1017084333-c6766857_673c144b-1caf-4c1d-ad85-bb6dd6356af8
2022/10/17 08:43:34 Job state: STOPPED
2022/10/17 08:43:34 Job state: STARTING
2022/10/17 08:43:34 Job state: RUNNING
2022/10/17 08:44:43 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/17 08:44:43 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/17 08:44:43 Job state: FAILED
2022/10/17 08:44:43 Failed to execute job: job load0tests0go0flink0batch0combine0101017065320-root-1017084333-c6766857_673c144b-1caf-4c1d-ad85-bb6dd6356af8 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101017065320-root-1017084333-c6766857_673c144b-1caf-4c1d-ad85-bb6dd6356af8 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16291a8, 0xc0001a8000}, {0x148bd11?, 0x1fa3200?}, {0xc0003d9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/fgp77ah57zcrw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
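The root cause in the trace above is Jackson refusing to map a JSON `null` into the primitive `long maxParallelism` field of Flink's `JobDetailsInfo`, because `DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES` is enabled. The sketch below mimics that check in Python purely for illustration; the function name and error text mirror the log, and it is not Flink's actual deserialization code. This typically happens when the REST client and the JobManager run different Flink versions, so a field the client requires is absent (null) in the server's response.

```python
import json

def parse_job_details(payload: str) -> dict:
    """Illustrative stand-in for Jackson deserializing JobDetailsInfo.

    Mimics FAIL_ON_NULL_FOR_PRIMITIVES: the primitive field
    `long maxParallelism` may not be null, so a null (or absent)
    value raises, just like the MismatchedInputException above.
    """
    data = json.loads(payload)
    if data.get("maxParallelism") is None:
        raise ValueError(
            'Cannot map `null` into type `long` '
            '(reference chain: JobDetailsInfo["maxParallelism"])'
        )
    return data

# A JobManager whose REST API does not report maxParallelism leaves the
# field null, and the newer client fails exactly as in the log above.
details = parse_job_details('{"jid": "j1", "name": "combine", "maxParallelism": 128}')
```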
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #682
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/682/display/redirect?page=changes>
Changes:
[noreply] [GitHub Actions] - Verify Release Build Workflow (#23390)
------------------------------------------
[...truncated 33.65 KB...]
2022/10/16 08:43:12 Using specified **** binary: 'linux_amd64/combine'
2022/10/16 08:43:12 Prepared job with id: load-tests-go-flink-batch-combine-1-1016065323_0a768800-90a3-48f4-99a5-ceede79844e4 and staging token: load-tests-go-flink-batch-combine-1-1016065323_0a768800-90a3-48f4-99a5-ceede79844e4
2022/10/16 08:43:16 Staged binary artifact with token:
2022/10/16 08:43:17 Submitted job: load0tests0go0flink0batch0combine0101016065323-root-1016084316-c605174f_50fb7287-ee73-4f4a-9fc5-477ec92c6d20
2022/10/16 08:43:17 Job state: STOPPED
2022/10/16 08:43:17 Job state: STARTING
2022/10/16 08:43:17 Job state: RUNNING
2022/10/16 08:44:26 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/16 08:44:26 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/16 08:44:26 Job state: FAILED
2022/10/16 08:44:26 Failed to execute job: job load0tests0go0flink0batch0combine0101016065323-root-1016084316-c605174f_50fb7287-ee73-4f4a-9fc5-477ec92c6d20 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101016065323-root-1016084316-c605174f_50fb7287-ee73-4f4a-9fc5-477ec92c6d20 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16291a8, 0xc00012e000}, {0x148bd11?, 0x1fa3200?}, {0xc0006f1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/nco2s3jexu6gy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
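The outer "Number of retries has been exhausted" failure comes from Flink retrying the JobDetailsInfo poll (`FutureUtils.retryOperationWithDelay` feeding `ClientUtils.waitUntilJobInitializationFinished`) until its retry budget runs out, because every poll hits the same parse error. The Python sketch below is an illustrative model of that loop only; the retry count, delay, and names are assumptions, not Flink's actual values or code.

```python
import time

def retry_operation_with_delay(op, max_retries=3, delay_s=0.0):
    """Illustrative model of Flink's retry-with-delay loop.

    Retries `op` up to max_retries times; when every attempt fails,
    surfaces the exhaustion error seen in the trace above, chaining
    the last underlying failure as the cause.
    """
    last_error = None
    for _ in range(max_retries):
        try:
            return op()  # e.g. poll JobDetailsInfo over the REST API
        except Exception as e:  # stands in for RestClientException
            last_error = e
            time.sleep(delay_s)
    raise RuntimeError(
        "Could not complete the operation. "
        "Number of retries has been exhausted."
    ) from last_error
```

Because every poll fails identically, exhaustion is inevitable and the submission is reported FAILED on the client side, even though the job itself may have started on the cluster.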
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #681
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/681/display/redirect?page=changes>
Changes:
[Moritz Mack] Minor improvements to the tpcds gradle build for Spark
[Moritz Mack] Fix SparkSessionFactory to not fail when using Spark master local[*]
[Moritz Mack] [Spark dataset runner] Add direct translation of Reshuffle and
[noreply] Make GCP OAuth scopes configurable via pipeline options. (#23644)
[noreply] Update BQIO to a single scheduled executor service reduce threads
------------------------------------------
[...truncated 33.88 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/15 08:43:41 Using specified worker binary: 'linux_amd64/combine'
2022/10/15 08:43:41 Prepared job with id: load-tests-go-flink-batch-combine-1-1015065321_b81ac577-0c3a-4368-b1d6-2797580fed74 and staging token: load-tests-go-flink-batch-combine-1-1015065321_b81ac577-0c3a-4368-b1d6-2797580fed74
2022/10/15 08:43:45 Staged binary artifact with token:
2022/10/15 08:43:46 Submitted job: load0tests0go0flink0batch0combine0101015065321-root-1015084345-9d7fb9db_fa76bd34-4df5-471a-a00b-19af0a93ec94
2022/10/15 08:43:46 Job state: STOPPED
2022/10/15 08:43:46 Job state: STARTING
2022/10/15 08:43:46 Job state: RUNNING
2022/10/15 08:44:55 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/15 08:44:55 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
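[Editor's note: the root cause above is Flink's REST client deserializing a JobDetailsInfo response whose "maxParallelism" field is null while Jackson's FAIL_ON_NULL_FOR_PRIMITIVES feature is enabled, so the null cannot be mapped into the primitive `long`. As a rough sketch of the payload shape (the `jobDetails` struct below is illustrative only, not Flink's actual class), the same response can be decoded in Go, whose encoding/json by contrast silently ignores `null` for a numeric field:]

```go
package main

import (
	"encoding/json"
	"fmt"
)

// jobDetails is a hypothetical mirror of the single JobDetailsInfo
// field named in the stack trace above.
type jobDetails struct {
	MaxParallelism int64 `json:"maxParallelism"`
}

// decode unmarshals a JobDetailsInfo-shaped JSON payload.
func decode(payload string) (jobDetails, error) {
	var d jobDetails
	err := json.Unmarshal([]byte(payload), &d)
	return d, err
}

func main() {
	// The session cluster answered with `null` for maxParallelism.
	d, err := decode(`{"maxParallelism": null}`)
	// Go's encoding/json treats null-into-int64 as a no-op and returns
	// no error, leaving the zero value; Jackson with
	// FAIL_ON_NULL_FOR_PRIMITIVES enabled instead throws the
	// MismatchedInputException seen in this build.
	fmt.Println(err, d.MaxParallelism) // prints: <nil> 0
}
```

[This suggests the failure is a client/cluster version mismatch over the REST schema rather than a problem in the submitted pipeline itself.]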
2022/10/15 08:44:55 Job state: FAILED
2022/10/15 08:44:55 Failed to execute job: job load0tests0go0flink0batch0combine0101015065321-root-1015084345-9d7fb9db_fa76bd34-4df5-471a-a00b-19af0a93ec94 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101015065321-root-1015084345-9d7fb9db_fa76bd34-4df5-471a-a00b-19af0a93ec94 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16291a8, 0xc00012e000}, {0x148bd11?, 0x1fa3200?}, {0xc000363e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/kynoro63ibjom
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #680
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/680/display/redirect?page=changes>
Changes:
[Kiley Sok] Add agent to open modules
[Kiley Sok] check for empty
[Kiley Sok] limit to jamm and update comments
[Kiley Sok] reuse options, pr comments
[rszper] Added content: The direct runner is not suited to production pipelines
[yixiaoshen] Remove artificial timeout in FirestoreV1IT, Dataflow runner is very slow
[Moritz Mack] Align translation logging for Spark dataset runner with rdd runner for
[noreply] Use new github output format (#23624)
[noreply] Tour of Beam frontend state management (#23420) (#23572)
[noreply] Update
[noreply] Merge pull request #23524: Adding beam blog info to the Community page
[noreply] Update publish_release_notes to generate PR list (#23630)
[noreply] Bump Legacy dataflow container image tag (#23625)
------------------------------------------
[...truncated 33.77 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/14 08:43:31 Using specified worker binary: 'linux_amd64/combine'
2022/10/14 08:43:32 Prepared job with id: load-tests-go-flink-batch-combine-1-1014065321_d1a75bb6-e709-4500-aace-e22ffb37e938 and staging token: load-tests-go-flink-batch-combine-1-1014065321_d1a75bb6-e709-4500-aace-e22ffb37e938
2022/10/14 08:43:36 Staged binary artifact with token:
2022/10/14 08:43:37 Submitted job: load0tests0go0flink0batch0combine0101014065321-root-1014084336-939e0428_157f922c-1be4-485a-ae0b-483158becdaf
2022/10/14 08:43:37 Job state: STOPPED
2022/10/14 08:43:37 Job state: STARTING
2022/10/14 08:43:37 Job state: RUNNING
2022/10/14 08:44:46 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/14 08:44:46 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/14 08:44:46 Job state: FAILED
2022/10/14 08:44:46 Failed to execute job: job load0tests0go0flink0batch0combine0101014065321-root-1014084336-939e0428_157f922c-1be4-485a-ae0b-483158becdaf failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101014065321-root-1014084336-939e0428_157f922c-1be4-485a-ae0b-483158becdaf failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16291a8, 0xc00004a0c0}, {0x148bd11?, 0x1fa3200?}, {0xc0003f5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 41s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/mjyedvbtwgqmu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #679
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/679/display/redirect?page=changes>
Changes:
[Alexey Romanenko] [website][adhoc] Fix spellcheck errors and typos
[noreply] Migrate GcsOptions#getExecutorService to an unbounded
[noreply] (BQ Java) Explicitly set coder for multi-partition batch load writes
[noreply] Fix typo in bootstrap_beam_venv.py (#23574)
[noreply] Bump github.com/spf13/cobra from 1.5.0 to 1.6.0 in /sdks (#23591)
[noreply] [Playground][Tour Of Beam] Datastore entities split by origin (#23088)
[noreply] use write schema only for read api (#23594)
[noreply] [Go SDK]: SingleFlight bundle descriptor requests (#23589)
[noreply] Extend a timeout to create a bt cluster. (#23617)
------------------------------------------
[...truncated 33.82 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/13 08:43:45 Using specified worker binary: 'linux_amd64/combine'
2022/10/13 08:43:46 Prepared job with id: load-tests-go-flink-batch-combine-1-1013065314_c65eb0bd-381b-4f19-b7cd-3b820502673a and staging token: load-tests-go-flink-batch-combine-1-1013065314_c65eb0bd-381b-4f19-b7cd-3b820502673a
2022/10/13 08:43:50 Staged binary artifact with token:
2022/10/13 08:43:52 Submitted job: load0tests0go0flink0batch0combine0101013065314-root-1013084350-567065a6_6f0c1448-10c5-4011-886f-5a863d4727e7
2022/10/13 08:43:52 Job state: STOPPED
2022/10/13 08:43:52 Job state: STARTING
2022/10/13 08:43:52 Job state: RUNNING
2022/10/13 08:45:01 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/13 08:45:01 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/13 08:45:01 Job state: FAILED
2022/10/13 08:45:01 Failed to execute job: job load0tests0go0flink0batch0combine0101013065314-root-1013084350-567065a6_6f0c1448-10c5-4011-886f-5a863d4727e7 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101013065314-root-1013084350-567065a6_6f0c1448-10c5-4011-886f-5a863d4727e7 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16291a8, 0xc00012e000}, {0x148bd11?, 0x1fa3200?}, {0xc000555e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ermfdk533kgqg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #678
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/678/display/redirect?page=changes>
Changes:
[git] BEAM-13592 Add getOrderingKey in
[git] Add CHANGES entry
[git] Rename transform name according to review comment
[git] Update to pass ordering key
[egalpin] Adds ordering key to OutgoingMessage builder, adds new coders to pubsub
[egalpin] Fixes pubsub bounded writer allowing for orderingKey
[egalpin] Alters order of pubsub message support in registrar
[egalpin] Removed publishTime and messageId in grpc pubsub client publish
[egalpin] Attempts to allow different pubsub root url for PubsubIO.Write
[egalpin] Fixes pubsub tests root url
[egalpin] Puts PubsubMessageCoder last in registrar
[egalpin] Uses MoreObjects over Objects
[egalpin] Renames PubsubMessageCoder to
[Robert Bradshaw] Add a multi-process shared utility.
[Robert Bradshaw] Add fastener dependency.
[Robert Bradshaw] Refactor to have an explicit acquire/release API.
[Robert Bradshaw] Drop a TODO about deferred construction parameterization.
[Robert Bradshaw] Fix unused import/var.
[Moritz Mack] Replace website references to deprecated aws / kinesis modules with more
[noreply] fix distribution example in golang guide (#23567)
[noreply] Add database role to SpannerConfig for role-based access control.
[noreply] Remove obsolete and deprecated bigquery native read. (#23557)
[noreply] Feature/name all java threads (#23387)
[noreply] [Go SDK] Don't construct plans in lock section. (#23583)
[noreply] Remove obsolete and deprecated bigquery native write. #23557 (#23558)
[noreply] Increase Python PostCommit timeout. (#23595)
------------------------------------------
[...truncated 33.84 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/12 08:44:06 Using specified worker binary: 'linux_amd64/combine'
2022/10/12 08:44:06 Prepared job with id: load-tests-go-flink-batch-combine-1-1012065344_6b43b6fe-6250-40d7-ae8f-4a50a06151a1 and staging token: load-tests-go-flink-batch-combine-1-1012065344_6b43b6fe-6250-40d7-ae8f-4a50a06151a1
2022/10/12 08:44:10 Staged binary artifact with token:
2022/10/12 08:44:11 Submitted job: load0tests0go0flink0batch0combine0101012065344-root-1012084410-34b2df8f_e5c1dd86-3c9e-46c5-9f36-337a25e64ad7
2022/10/12 08:44:11 Job state: STOPPED
2022/10/12 08:44:11 Job state: STARTING
2022/10/12 08:44:11 Job state: RUNNING
2022/10/12 08:45:20 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/12 08:45:20 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/12 08:45:20 Job state: FAILED
2022/10/12 08:45:20 Failed to execute job: job load0tests0go0flink0batch0combine0101012065344-root-1012084410-34b2df8f_e5c1dd86-3c9e-46c5-9f36-337a25e64ad7 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101012065344-root-1012084410-34b2df8f_e5c1dd86-3c9e-46c5-9f36-337a25e64ad7 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1627848, 0xc00004a0c0}, {0x148a52b?, 0x1fa0120?}, {0xc0005d9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/df6mx5f6dlk7y
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #677
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/677/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] update python-dependencies.md link #23478
[bulat.safiullin] [Website] update styles of iframe with video #23499
[bulat.safiullin] [Website] add version.html to shortcodes, update jet.md 22985
[Moritz Mack] Downgrade Scala version in Spark job-server to prevent Scala
[noreply] Support named databases in Firestore connector. Fix and enable Firestore
[noreply] [fixes #23000] Update the Python SDK harness state cache to be a loading
[noreply] Fix permission for Build python wheel branch_repo_nightly step (#23563)
[noreply] [Playground] complexity indicator (#23477)
[noreply] Reolling forward property-based tests for coders (#23425)
[noreply] Updated README for jupyterlab-sidepanel
------------------------------------------
[...truncated 33.83 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/11 08:43:35 Using specified worker binary: 'linux_amd64/combine'
2022/10/11 08:43:35 Prepared job with id: load-tests-go-flink-batch-combine-1-1011065318_4b3163fe-5689-46e8-b940-6cbc02d0f202 and staging token: load-tests-go-flink-batch-combine-1-1011065318_4b3163fe-5689-46e8-b940-6cbc02d0f202
2022/10/11 08:43:39 Staged binary artifact with token:
2022/10/11 08:43:40 Submitted job: load0tests0go0flink0batch0combine0101011065318-root-1011084339-f225374a_467755f0-4c9f-4a7f-85ab-ea76de65372c
2022/10/11 08:43:40 Job state: STOPPED
2022/10/11 08:43:40 Job state: STARTING
2022/10/11 08:43:40 Job state: RUNNING
2022/10/11 08:44:49 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
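The `FutureUtils$RetryException` in the trace above means Flink's REST client kept re-polling the job-details endpoint until its retry budget ran out, then surfaced "Number of retries has been exhausted." The underlying pattern is a bounded retry loop with a delay between attempts. A stdlib-only sketch of that pattern (the method name and signature below are illustrative, not Flink's actual `FutureUtils.retryOperationWithDelay` API):

```java
import java.util.concurrent.Callable;

// Sketch of bounded retry-with-delay: re-attempt a failing operation up to
// maxRetries extra times, then surface a "retries exhausted" error carrying
// the last underlying failure as its cause -- the shape of the
// RetryException seen in the stack trace.
public class RetryWithDelay {
    public static <T> T retry(Callable<T> op, int maxRetries, long delayMillis) {
        Exception last = null;
        // One initial attempt plus maxRetries retries.
        for (int attempt = 0; attempt <= maxRetries; attempt++) {
            try {
                return op.call();
            } catch (Exception e) {
                last = e;
                try {
                    Thread.sleep(delayMillis);
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    break;
                }
            }
        }
        throw new IllegalStateException(
            "Could not complete the operation. Number of retries has been exhausted.",
            last);
    }

    public static void main(String[] args) {
        // Succeeds on the third attempt, well within the retry budget.
        int[] calls = {0};
        String result = retry(() -> {
            if (++calls[0] < 3) throw new RuntimeException("not ready");
            return "done";
        }, 5, 10);
        System.out.println(result); // done
    }
}
```

In this failure, every poll hit the same deserialization error (below), so no number of retries could have succeeded.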
2022/10/11 08:44:49 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
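The root cause restated above is a Jackson mapping failure: the response JSON carried `null` for `JobDetailsInfo`'s `maxParallelism`, which is declared as a primitive `long` and so has no null representation. Jackson's default `FAIL_ON_NULL_FOR_PRIMITIVES` behavior is to reject this with a `MismatchedInputException` rather than silently substitute a default. A stdlib-only sketch of the two policies (class and method names below are hypothetical stand-ins, not Jackson's API):

```java
// Sketch: why JSON null cannot populate a primitive long field.
// JobDetails stands in for Flink's JobDetailsInfo; fromWire stands in for
// the deserializer's null-handling policy.
public class NullIntoPrimitive {
    static class JobDetails {
        long maxParallelism; // primitive: no way to store null
    }

    static long fromWire(Long wireValue, boolean failOnNullForPrimitives) {
        if (wireValue == null) {
            if (failOnNullForPrimitives) {
                // Jackson's default: refuse, as in the exception above.
                throw new IllegalArgumentException(
                    "Cannot map `null` into type `long`");
            }
            return 0L; // lenient mode substitutes the primitive default
        }
        return wireValue;
    }

    public static void main(String[] args) {
        JobDetails d = new JobDetails();
        d.maxParallelism = fromWire(null, false);
        System.out.println(d.maxParallelism); // 0
        try {
            fromWire(null, true);
        } catch (IllegalArgumentException e) {
            System.out.println("strict mode: " + e.getMessage());
        }
    }
}
```

A `null` here typically indicates the REST client and the cluster disagree on the `JobDetailsInfo` schema, e.g. a version mismatch between the submitting client and the Flink session cluster.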
2022/10/11 08:44:50 Job state: FAILED
2022/10/11 08:44:50 Failed to execute job: job load0tests0go0flink0batch0combine0101011065318-root-1011084339-f225374a_467755f0-4c9f-4a7f-85ab-ea76de65372c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101011065318-root-1011084339-f225374a_467755f0-4c9f-4a7f-85ab-ea76de65372c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1627848, 0xc000136000}, {0x148a52b?, 0x1fa0120?}, {0xc0005cfe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/qvxxu3qjlng6g
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #676
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/676/display/redirect>
Changes:
------------------------------------------
[...truncated 33.74 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/10 08:43:23 Using specified worker binary: 'linux_amd64/combine'
2022/10/10 08:43:23 Prepared job with id: load-tests-go-flink-batch-combine-1-1010065313_1d245c59-fde6-4118-b05b-a09d185117f3 and staging token: load-tests-go-flink-batch-combine-1-1010065313_1d245c59-fde6-4118-b05b-a09d185117f3
2022/10/10 08:43:27 Staged binary artifact with token:
2022/10/10 08:43:28 Submitted job: load0tests0go0flink0batch0combine0101010065313-root-1010084327-32e8d14d_fc8ad8b9-8562-4716-91c3-427fb221adb6
2022/10/10 08:43:28 Job state: STOPPED
2022/10/10 08:43:28 Job state: STARTING
2022/10/10 08:43:28 Job state: RUNNING
2022/10/10 08:44:38 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/10 08:44:38 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/10 08:44:38 Job state: FAILED
2022/10/10 08:44:38 Failed to execute job: job load0tests0go0flink0batch0combine0101010065313-root-1010084327-32e8d14d_fc8ad8b9-8562-4716-91c3-427fb221adb6 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101010065313-root-1010084327-32e8d14d_fc8ad8b9-8562-4716-91c3-427fb221adb6 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1627848, 0xc00004a0c0}, {0x148a52b?, 0x1fa0120?}, {0xc00034be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wgtvh4d5r7sw2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #675
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/675/display/redirect?page=changes>
Changes:
[noreply] Merge pull request #23547: update bom to the latest one.
------------------------------------------
[...truncated 33.71 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/09 08:43:36 Using specified worker binary: 'linux_amd64/combine'
2022/10/09 08:43:36 Prepared job with id: load-tests-go-flink-batch-combine-1-1009065317_39f4b26b-8c32-44b3-af33-5792a6d76b49 and staging token: load-tests-go-flink-batch-combine-1-1009065317_39f4b26b-8c32-44b3-af33-5792a6d76b49
2022/10/09 08:43:40 Staged binary artifact with token:
2022/10/09 08:43:41 Submitted job: load0tests0go0flink0batch0combine0101009065317-root-1009084340-6a2c789b_2f7650de-067a-4864-bf15-50f6cc82be74
2022/10/09 08:43:41 Job state: STOPPED
2022/10/09 08:43:41 Job state: STARTING
2022/10/09 08:43:41 Job state: RUNNING
2022/10/09 08:44:50 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/09 08:44:50 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/09 08:44:50 Job state: FAILED
2022/10/09 08:44:50 Failed to execute job: job load0tests0go0flink0batch0combine0101009065317-root-1009084340-6a2c789b_2f7650de-067a-4864-bf15-50f6cc82be74 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101009065317-root-1009084340-6a2c789b_2f7650de-067a-4864-bf15-50f6cc82be74 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1627848, 0xc00012e000}, {0x148a52b?, 0x1fa0120?}, {0xc000307e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/5t4vqcujqnqj6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #674
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/674/display/redirect?page=changes>
Changes:
[Moritz Mack] Correctly detect retryable TransientKinesisExceptions (fixes #23517)
[noreply] Bump actions/stale from 5 to 6 (#23331)
[noreply] Fix small error message typo
[noreply] Fixing right nav on Get Started page (#23543)
[noreply] Bump google.golang.org/grpc from 1.49.0 to 1.50.0 in /sdks (#23533)
------------------------------------------
[...truncated 33.83 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/08 08:44:01 Using specified worker binary: 'linux_amd64/combine'
2022/10/08 08:44:01 Prepared job with id: load-tests-go-flink-batch-combine-1-1008065328_e11edc5f-9925-48c2-b16c-6db41f129360 and staging token: load-tests-go-flink-batch-combine-1-1008065328_e11edc5f-9925-48c2-b16c-6db41f129360
2022/10/08 08:44:06 Staged binary artifact with token:
2022/10/08 08:44:07 Submitted job: load0tests0go0flink0batch0combine0101008065328-root-1008084406-26f41080_aa1f20a6-911d-413e-a700-85d3244a3ab8
2022/10/08 08:44:07 Job state: STOPPED
2022/10/08 08:44:07 Job state: STARTING
2022/10/08 08:44:07 Job state: RUNNING
2022/10/08 08:45:16 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/08 08:45:16 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/08 08:45:16 Job state: FAILED
2022/10/08 08:45:16 Failed to execute job: job load0tests0go0flink0batch0combine0101008065328-root-1008084406-26f41080_aa1f20a6-911d-413e-a700-85d3244a3ab8 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101008065328-root-1008084406-26f41080_aa1f20a6-911d-413e-a700-85d3244a3ab8 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1627848, 0xc000136000}, {0x148a52b?, 0x1fa0120?}, {0xc0005e1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 45s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/gbttdugqro5fq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #673
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/673/display/redirect?page=changes>
Changes:
[noreply] Fix broken link in online clustering documentation (#23516)
[toran.sahu] fix typo - s/befrehand/beforehand
[noreply] Grant actions using GITHUB_TOKEN the appropriate permission set (#23521)
[noreply] Fix failing Py37 BQ file loads test (#23334)
[noreply] [Website] update links to https (#23523)
[noreply] Support custom avro DatumReader when reading from BigQuery (#22718)
[noreply] Rename 'clean' Gradle task that required Flutter and has been breaking
[noreply] Model handler unit test (#23506)
[noreply] Content/multi model pipelines (#23498)
[noreply] [Tour of Beam][Frontend] Content Tree and SDK models (#23316) (#23417)
[noreply] Fix bug where `astype(CategoricalDtype)` is rejected (#23513)
------------------------------------------
[...truncated 33.79 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/07 08:43:40 Using specified worker binary: 'linux_amd64/combine'
2022/10/07 08:43:40 Prepared job with id: load-tests-go-flink-batch-combine-1-1007065327_aaa94ced-de35-4778-abde-dfd7656efab9 and staging token: load-tests-go-flink-batch-combine-1-1007065327_aaa94ced-de35-4778-abde-dfd7656efab9
2022/10/07 08:43:45 Staged binary artifact with token:
2022/10/07 08:43:46 Submitted job: load0tests0go0flink0batch0combine0101007065327-root-1007084345-1a40aef1_a5a75b71-a591-4ae7-91f5-e3542038034e
2022/10/07 08:43:46 Job state: STOPPED
2022/10/07 08:43:46 Job state: STARTING
2022/10/07 08:43:46 Job state: RUNNING
2022/10/07 08:44:55 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/07 08:44:55 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/07 08:44:55 Job state: FAILED
2022/10/07 08:44:55 Failed to execute job: job load0tests0go0flink0batch0combine0101007065327-root-1007084345-1a40aef1_a5a75b71-a591-4ae7-91f5-e3542038034e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101007065327-root-1007084345-1a40aef1_a5a75b71-a591-4ae7-91f5-e3542038034e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc000136000}, {0x148a3ab?, 0x1fa0100?}, {0xc0000cfe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/6kqzearwssrf2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #672
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/672/display/redirect?page=changes>
Changes:
[Moritz Mack] Ensure Java JMH benchmark tasks run sequentially to prevent failure when
[Moritz Mack] Fix validation of measurement name in InfluxDBPublisher (addresses
[noreply] group_id (#23445)
[noreply] Give issue tagger permission to write issues (#23485)
[noreply] Update UID (#23486)
[noreply] Improve error message in GcsUtil (#23482)
[noreply] Add more typescript examples to the programming guide. (#23058)
[noreply] Merge pull request #23505: opt in for schema update. addresses #23504
[noreply] fix: only report backlog bytes on data records (#23493)
------------------------------------------
[...truncated 33.77 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/06 08:43:46 Using specified worker binary: 'linux_amd64/combine'
2022/10/06 08:43:46 Prepared job with id: load-tests-go-flink-batch-combine-1-1006065327_760be38e-8236-4844-a524-9b5b5e3ae539 and staging token: load-tests-go-flink-batch-combine-1-1006065327_760be38e-8236-4844-a524-9b5b5e3ae539
2022/10/06 08:43:50 Staged binary artifact with token:
2022/10/06 08:43:52 Submitted job: load0tests0go0flink0batch0combine0101006065327-root-1006084351-de7e1ab3_50a41956-31de-4a1b-a14d-8628ef6d80cd
2022/10/06 08:43:52 Job state: STOPPED
2022/10/06 08:43:52 Job state: STARTING
2022/10/06 08:43:52 Job state: RUNNING
2022/10/06 08:45:01 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/06 08:45:01 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/06 08:45:01 Job state: FAILED
2022/10/06 08:45:01 Failed to execute job: job load0tests0go0flink0batch0combine0101006065327-root-1006084351-de7e1ab3_50a41956-31de-4a1b-a14d-8628ef6d80cd failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101006065327-root-1006084351-de7e1ab3_50a41956-31de-4a1b-a14d-8628ef6d80cd failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc00004a0c0}, {0x148a3ab?, 0x1fa0100?}, {0xc000799e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/d2grex2j4lsqq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #671
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/671/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] fix navbar footer overlap #22698
[ningkang0957] Prep sidepanel 3.0.0 release
[noreply] AI/ML pipelines master page documentation (#23443)
[noreply] Fix go fmt error (#23474)
[noreply] Revert "Add drop_example flag to the RunInference and Model Handler
[noreply] Documented supported PyTorch versions (#22974)
[noreply] [Go SDK] Add fake impulse for inputs in Xlang Transform (#23383)
[noreply] Write permissions for issue closer/assigner
[noreply] GA Migration Adding Removal of /.m2/settings.xml (#23481)
[noreply] Bump google-cloud-spanner version for py containers (#23480)
------------------------------------------
[...truncated 33.74 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/05 08:43:42 Using specified worker binary: 'linux_amd64/combine'
2022/10/05 08:43:43 Prepared job with id: load-tests-go-flink-batch-combine-1-1005065328_9fe0cdeb-3798-4734-8e91-a2397e62f1bc and staging token: load-tests-go-flink-batch-combine-1-1005065328_9fe0cdeb-3798-4734-8e91-a2397e62f1bc
2022/10/05 08:43:47 Staged binary artifact with token:
2022/10/05 08:43:48 Submitted job: load0tests0go0flink0batch0combine0101005065328-root-1005084347-d126e58a_583c552a-64ad-4522-abd5-be677bcd55ad
2022/10/05 08:43:48 Job state: STOPPED
2022/10/05 08:43:48 Job state: STARTING
2022/10/05 08:43:48 Job state: RUNNING
2022/10/05 08:44:57 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/05 08:44:57 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/05 08:44:57 Job state: FAILED
2022/10/05 08:44:57 Failed to execute job: job load0tests0go0flink0batch0combine0101005065328-root-1005084347-d126e58a_583c552a-64ad-4522-abd5-be677bcd55ad failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101005065328-root-1005084347-d126e58a_583c552a-64ad-4522-abd5-be677bcd55ad failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc000134000}, {0x148a3ab?, 0x1fa0100?}, {0xc0000e9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 44s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/43rhmvghspyle
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
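[Editor's note] The failure above bottoms out in Flink's REST client: the job-details response carried a `null` value for `maxParallelism`, and Flink's shaded Jackson mapper (with `FAIL_ON_NULL_FOR_PRIMITIVES` enabled) refuses to write `null` into a primitive `long`, producing the `MismatchedInputException` that kills job submission. A minimal stdlib-only sketch of the underlying Java constraint — illustrative only; the class and method names below are hypothetical, not Flink or Beam code:

```java
// Illustrative sketch (assumption: not Flink/Beam code) of why a strict JSON
// deserializer cannot map `null` into a primitive `long` field such as
// JobDetailsInfo.maxParallelism: Java primitives have no null representation,
// so unboxing a null wrapper throws, and a strict mapper must either fail
// (as the log shows) or be configured to coerce null to a default.
public class NullIntoPrimitive {
    // Simulates assigning a parsed-but-null JSON value to a primitive field.
    static String mapNullIntoLong(Long parsedValue) {
        try {
            long maxParallelism = parsedValue; // unboxing null throws NPE
            return "mapped: " + maxParallelism;
        } catch (NullPointerException e) {
            return "cannot map null into long";
        }
    }

    public static void main(String[] args) {
        System.out.println(mapNullIntoLong(null));
        System.out.println(mapNullIntoLong(42L));
    }
}
```

In the actual failure, the fix lies on the server side (the REST endpoint returning a complete `JobDetailsInfo`) or in relaxing the mapper's null handling, not in the Beam load test itself.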
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #670
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/670/display/redirect?page=changes>
Changes:
[noreply] [Tour Of Beam] return taskSnippetId/solutionSnippedId (#23419)
[noreply] Beam 21465 add requires stable input (#23230)
[noreply] [Website] Add new Java quickstart (#22747)
[Robert Bradshaw] Require time-bound flag for non-UW streaming Python jobs for new SDKs.
[noreply] Fix JdbcIOIT, which seems to have never worked (#21796)
[noreply] Support DECIMAL logical type in python SDK (#23014)
------------------------------------------
[...truncated 33.73 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/04 08:43:55 Using specified **** binary: 'linux_amd64/combine'
2022/10/04 08:43:55 Prepared job with id: load-tests-go-flink-batch-combine-1-1004065318_fcd9d9c9-80b5-48ac-a899-9e1419a4fe8d and staging token: load-tests-go-flink-batch-combine-1-1004065318_fcd9d9c9-80b5-48ac-a899-9e1419a4fe8d
2022/10/04 08:44:00 Staged binary artifact with token:
2022/10/04 08:44:01 Submitted job: load0tests0go0flink0batch0combine0101004065318-root-1004084400-b6d1c92c_7f5268f9-403b-4791-b394-900535af5bb6
2022/10/04 08:44:01 Job state: STOPPED
2022/10/04 08:44:01 Job state: STARTING
2022/10/04 08:44:01 Job state: RUNNING
2022/10/04 08:45:11 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/04 08:45:11 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/04 08:45:11 Job state: FAILED
2022/10/04 08:45:11 Failed to execute job: job load0tests0go0flink0batch0combine0101004065318-root-1004084400-b6d1c92c_7f5268f9-403b-4791-b394-900535af5bb6 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101004065318-root-1004084400-b6d1c92c_7f5268f9-403b-4791-b394-900535af5bb6 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc00004a0c0}, {0x148a3ab?, 0x1fa0100?}, {0xc000153e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/2nxnanllwuzta
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #669
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/669/display/redirect>
Changes:
------------------------------------------
[...truncated 33.82 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/03 08:43:55 Using specified **** binary: 'linux_amd64/combine'
2022/10/03 08:43:56 Prepared job with id: load-tests-go-flink-batch-combine-1-1003065323_87d6c521-c308-42b9-8e62-6bb2dbb50132 and staging token: load-tests-go-flink-batch-combine-1-1003065323_87d6c521-c308-42b9-8e62-6bb2dbb50132
2022/10/03 08:44:00 Staged binary artifact with token:
2022/10/03 08:44:01 Submitted job: load0tests0go0flink0batch0combine0101003065323-root-1003084400-94bdc4e6_182a8d19-0df3-4451-a2b3-c036631bbbba
2022/10/03 08:44:01 Job state: STOPPED
2022/10/03 08:44:01 Job state: STARTING
2022/10/03 08:44:01 Job state: RUNNING
2022/10/03 08:45:10 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/03 08:45:10 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/03 08:45:10 Job state: FAILED
2022/10/03 08:45:10 Failed to execute job: job load0tests0go0flink0batch0combine0101003065323-root-1003084400-94bdc4e6_182a8d19-0df3-4451-a2b3-c036631bbbba failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101003065323-root-1003084400-94bdc4e6_182a8d19-0df3-4451-a2b3-c036631bbbba failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc00012e000}, {0x148a3ab?, 0x1fa0100?}, {0xc000307e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/nvhmgu6wiqnbw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
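Editor's note: every failure in this digest shares the same root error, a Jackson `MismatchedInputException` raised while the Flink REST client parses the `/jobs/:jobid` response into `JobDetailsInfo` — the `maxParallelism` field arrives as JSON `null`, which cannot be mapped into a primitive `long` (this pattern typically points at a version skew between the Flink client and the cluster, where one side's `JobDetailsInfo` schema lacks the field). The failure mode can be sketched generically in Python with only the standard library; the field names below mirror the log but are illustrative, not the real Flink REST schema:

```python
import json

def require_long(obj, field):
    """Reject JSON null (or any non-integer) where a primitive long is
    expected -- the same check Jackson performs when
    FAIL_ON_NULL_FOR_PRIMITIVES is enabled (its default for creators)."""
    value = obj.get(field)
    # bool is a subclass of int in Python, so exclude it explicitly.
    if not isinstance(value, int) or isinstance(value, bool):
        raise ValueError(
            f"Cannot map {value!r} into type 'long' "
            f"(reference chain: JobDetailsInfo[{field!r}])"
        )
    return value

# A response whose maxParallelism is null, as in the logs above:
payload = '{"jid": "abc", "name": "combine", "maxParallelism": null}'
details = json.loads(payload)

try:
    require_long(details, "maxParallelism")
except ValueError as e:
    print(e)
```

As the log's own error text notes, Jackson offers an escape hatch (`DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES` set to `false`), but that is a client-side Flink code change; aligning the Flink client and cluster versions is the more likely fix for these load tests.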
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #668
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/668/display/redirect?page=changes>
Changes:
[noreply] JdbcIO fetchSize can be set to Integer.MIN_VALUE (#23444)
------------------------------------------
[...truncated 33.76 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/02 08:43:34 Using specified **** binary: 'linux_amd64/combine'
2022/10/02 08:43:34 Prepared job with id: load-tests-go-flink-batch-combine-1-1002065324_fc7f9a96-6e4b-4064-8b6f-c71d348eb638 and staging token: load-tests-go-flink-batch-combine-1-1002065324_fc7f9a96-6e4b-4064-8b6f-c71d348eb638
2022/10/02 08:43:39 Staged binary artifact with token:
2022/10/02 08:43:40 Submitted job: load0tests0go0flink0batch0combine0101002065324-root-1002084339-e8edc5f1_ba5aac4f-5232-4163-a587-95a86a26dad1
2022/10/02 08:43:40 Job state: STOPPED
2022/10/02 08:43:40 Job state: STARTING
2022/10/02 08:43:40 Job state: RUNNING
2022/10/02 08:44:48 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/02 08:44:48 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/10/02 08:44:48 Job state: FAILED
2022/10/02 08:44:49 Failed to execute job: job load0tests0go0flink0batch0combine0101002065324-root-1002084339-e8edc5f1_ba5aac4f-5232-4163-a587-95a86a26dad1 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101002065324-root-1002084339-e8edc5f1_ba5aac4f-5232-4163-a587-95a86a26dad1 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc00012e000}, {0x148a3ab?, 0x1fa0100?}, {0xc0000f5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/si2dimgoygnkk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #667
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/667/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Batch encoding and decoding of schema data.
[Robert Bradshaw] Add microbenchmark for batch row encoding.
[Robert Bradshaw] Add batch testing for standard row coders.
[noreply] RunInference Benchmarks UI (#23426)
[noreply] Relax `pip` check in setup.py to allow installation via other package
[noreply] replaced tabs with spaces in readme file (#23446)
[noreply] [Playground] [Backend] Adding the tags field to the example response
[noreply] [Playground] [Backend] Edited the function for getting executable name
[noreply] Fix type inference for set/delete attr. (#23242)
[noreply] Support VR test including TestStream for Spark runner in streaming mode
[noreply] Add cron job to trigger Java JMH micro-benchmarks weekly (#23388)
------------------------------------------
[...truncated 33.84 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/10/01 08:43:50 Using specified **** binary: 'linux_amd64/combine'
2022/10/01 08:43:50 Prepared job with id: load-tests-go-flink-batch-combine-1-1001065327_87718010-e21f-4c26-a1ba-d4c270235fbb and staging token: load-tests-go-flink-batch-combine-1-1001065327_87718010-e21f-4c26-a1ba-d4c270235fbb
2022/10/01 08:43:54 Staged binary artifact with token:
2022/10/01 08:43:55 Submitted job: load0tests0go0flink0batch0combine0101001065327-root-1001084354-bb953412_24bc004b-c636-40f3-aa15-bdc16c5251c0
2022/10/01 08:43:56 Job state: STOPPED
2022/10/01 08:43:56 Job state: STARTING
2022/10/01 08:43:56 Job state: RUNNING
2022/10/01 08:45:04 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/10/01 08:45:04 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
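The root cause above is a strict-deserialization failure: the Flink REST client received a JobDetailsInfo response whose `maxParallelism` field was `null`, and Jackson's FAIL_ON_NULL_FOR_PRIMITIVES behavior refuses to map `null` into a Java primitive `long`. The following is a minimal Python sketch of that failure mode — it is illustrative only, not Flink or Jackson code, and the field list is an assumption taken from the error message:

```python
import json

def parse_job_details(payload, primitive_fields=("maxParallelism",)):
    """Parse a JobDetailsInfo-like JSON object, rejecting nulls in fields
    that a strict deserializer would map into primitives (hypothetical
    analogue of Jackson's FAIL_ON_NULL_FOR_PRIMITIVES)."""
    details = json.loads(payload)
    for field in primitive_fields:
        if details.get(field) is None:
            # Mirrors: Cannot map `null` into type `long`
            raise ValueError(
                f"Cannot map `null` into a primitive for field {field!r}"
            )
    return details

# A response carrying "maxParallelism": null triggers the same class of error:
try:
    parse_job_details('{"jid": "abc", "maxParallelism": null}')
except ValueError as e:
    print(e)
```

In the real stack, the analogous escape hatch is the one named in the error text — disabling FAIL_ON_NULL_FOR_PRIMITIVES so nulls fall back to the primitive's default — though here the mismatch points to a JobManager/client version skew rather than a client-side setting.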
2022/10/01 08:45:04 Job state: FAILED
2022/10/01 08:45:05 Failed to execute job: job load0tests0go0flink0batch0combine0101001065327-root-1001084354-bb953412_24bc004b-c636-40f3-aa15-bdc16c5251c0 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0101001065327-root-1001084354-bb953412_24bc004b-c636-40f3-aa15-bdc16c5251c0 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc00004a0c0}, {0x148a3ab?, 0x1fa0100?}, {0xc000673e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ucprxbdzwfzdq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #666
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/666/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Python cross language docs.
[srohde] Add documentation link to the interactive environment
[noreply] Fix Small pytorch notebook bug fix (#23407)
[noreply] PubsubIO - Improve limit validations to consider attributes (#23023)
[noreply] Example of Online Clustering (#23289)
[noreply] Bump google.golang.org/api from 0.97.0 to 0.98.0 in /sdks (#23394)
[noreply] Increase Go Dataflow Postcommit timeout to 5h (#23423)
[noreply] [Playground] [Backend] Updating endpoints for playground examples
[noreply] Send JavaScript messages to Playground iframes when switching the
[noreply] [Playground] [Backend] Adding SDK to the example response (#22871)
[noreply] [Playground] [Backend] Removing the code related to the Cloud Storage
[noreply] [BEAM-10785] Change RowAsDictJsonCoder to not ensure ASCII while
[noreply] Update Python katas to latest version of EduTools and Beam 2.41 (#23180)
------------------------------------------
[...truncated 33.77 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/30 08:43:43 Using specified **** binary: 'linux_amd64/combine'
2022/09/30 08:43:44 Prepared job with id: load-tests-go-flink-batch-combine-1-0930065953_66b6fc2b-d4de-4fa7-a707-33190ad95906 and staging token: load-tests-go-flink-batch-combine-1-0930065953_66b6fc2b-d4de-4fa7-a707-33190ad95906
2022/09/30 08:45:44 Staged binary artifact with token:
2022/09/30 08:45:46 Submitted job: load0tests0go0flink0batch0combine0100930065953-root-0930084545-5bc056bf_d75b6a09-8c90-44ad-b54c-6a2a544e57d1
2022/09/30 08:45:46 Job state: STOPPED
2022/09/30 08:45:46 Job state: STARTING
2022/09/30 08:45:46 Job state: RUNNING
2022/09/30 08:46:55 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/30 08:46:55 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/30 08:46:55 Job state: FAILED
2022/09/30 08:46:55 Failed to execute job: job load0tests0go0flink0batch0combine0100930065953-root-0930084545-5bc056bf_d75b6a09-8c90-44ad-b54c-6a2a544e57d1 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100930065953-root-0930084545-5bc056bf_d75b6a09-8c90-44ad-b54c-6a2a544e57d1 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16275a8, 0xc00012e000}, {0x148a3ab?, 0x1fa0100?}, {0xc000699e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 3m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/olk7rnolvcbzo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #665
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/665/display/redirect?page=changes>
Changes:
[noreply] [Tour Of Beam] API adjustments (#23349)
[noreply] Adds support in Samza Runner to run DoFn.processElement in parallel
[noreply] Regenerate Go Protos (#23408)
[noreply] Support google-cloud-spanner v3 and fixes broken unit tests (#23365)
[noreply] Add relevant docs to Cloud Profiler exceptions. (#23404)
[noreply] Update state cache to not fail when measuring object sizes. (#23391)
------------------------------------------
[...truncated 33.70 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/29 08:45:18 Using specified **** binary: 'linux_amd64/combine'
2022/09/29 08:45:19 Prepared job with id: load-tests-go-flink-batch-combine-1-0929065327_726bade2-cf47-40cf-b215-43e1aac9f328 and staging token: load-tests-go-flink-batch-combine-1-0929065327_726bade2-cf47-40cf-b215-43e1aac9f328
2022/09/29 08:45:23 Staged binary artifact with token:
2022/09/29 08:45:24 Submitted job: load0tests0go0flink0batch0combine0100929065327-root-0929084523-a4270614_5e113708-ed70-49aa-bcb8-b16633cc3c68
2022/09/29 08:45:24 Job state: STOPPED
2022/09/29 08:45:24 Job state: STARTING
2022/09/29 08:45:24 Job state: RUNNING
2022/09/29 08:46:33 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/29 08:46:33 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
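The MismatchedInputException above comes down to Jackson's FAIL_ON_NULL_FOR_PRIMITIVES default: the Flink REST response carried a null "maxParallelism" value, which cannot be mapped into a primitive `long` in JobDetailsInfo, so job submission fails before the pipeline runs. A minimal Python sketch of that mapping behavior (illustrative only; `parse_max_parallelism` is a hypothetical helper, not Flink or Jackson code):

```python
import json

def parse_max_parallelism(body: str, fail_on_null_for_primitives: bool = True) -> int:
    # Mimics how Jackson maps the "maxParallelism" JSON field into a
    # primitive long field of JobDetailsInfo.
    details = json.loads(body)
    value = details.get("maxParallelism")
    if value is None:
        if fail_on_null_for_primitives:
            # This is the branch the REST client hit in the log above.
            raise ValueError("Cannot map `null` into type `long`")
        # With the feature disabled, Jackson substitutes the primitive
        # default (0L) instead of raising.
        return 0
    return int(value)
```

With the feature disabled, `parse_max_parallelism('{"maxParallelism": null}', False)` returns 0 instead of raising, which mirrors what setting FAIL_ON_NULL_FOR_PRIMITIVES to 'false' would do on the Flink side.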
2022/09/29 08:46:33 Job state: FAILED
2022/09/29 08:46:33 Failed to execute job: job load0tests0go0flink0batch0combine0100929065327-root-0929084523-a4270614_5e113708-ed70-49aa-bcb8-b16633cc3c68 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100929065327-root-0929084523-a4270614_5e113708-ed70-49aa-bcb8-b16633cc3c68 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1627548, 0xc00012e000}, {0x148a34b?, 0x1fa0100?}, {0xc0000efe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 2m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/m47p6xlgnkp7w
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #664
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/664/display/redirect?page=changes>
Changes:
[shaojwu] make identifier of Date&DateTime to be a public static field
[shaojwu] make identifier of Time to be a public static field
[noreply] Add a tensorflow example to the run_inference_basic notebook (#23173)
[noreply] RunInference Benchmarks UI (#23371)
[noreply] set upper bound on google-cloud-profiler (#23354)
[noreply] Add ISSUE#23071 to CHANGES.md (#23297)
[noreply] Pin objsize version to avoid regression in 0.6.0 (#23396)
------------------------------------------
[...truncated 33.71 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/28 08:43:31 Using specified **** binary: 'linux_amd64/combine'
2022/09/28 08:43:31 Prepared job with id: load-tests-go-flink-batch-combine-1-0928065323_09f9299d-c73d-490a-a356-3cb2e0e55704 and staging token: load-tests-go-flink-batch-combine-1-0928065323_09f9299d-c73d-490a-a356-3cb2e0e55704
2022/09/28 08:43:35 Staged binary artifact with token:
2022/09/28 08:43:36 Submitted job: load0tests0go0flink0batch0combine0100928065323-root-0928084335-b76a76ef_17b00d89-fad8-452d-b92a-a5cee1d21286
2022/09/28 08:43:36 Job state: STOPPED
2022/09/28 08:43:36 Job state: STARTING
2022/09/28 08:43:36 Job state: RUNNING
2022/09/28 08:44:45 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/28 08:44:45 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/28 08:44:45 Job state: FAILED
2022/09/28 08:44:45 Failed to execute job: job load0tests0go0flink0batch0combine0100928065323-root-0928084335-b76a76ef_17b00d89-fad8-452d-b92a-a5cee1d21286 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100928065323-root-0928084335-b76a76ef_17b00d89-fad8-452d-b92a-a5cee1d21286 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1625f28, 0xc000136000}, {0x1488e91?, 0x1f9ce98?}, {0xc000659e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/yhiah5wfhl7vs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #663
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/663/display/redirect?page=changes>
Changes:
[noreply] Bump org.nosphere.apache.rat from 0.7.0 to 0.8.0 (#23330)
[ningkang0957] Upgraded Flink on Dataproc support from Interacitve Beam
[noreply] GA Migration PreCommit and PostCommit Tables in CI.md (#23372)
[noreply] Stack Trace Decoration for Beam Samza Runner (#23221)
[noreply] [#22478]: Add read_time support to Google Firestore connector (#22966)
[noreply] Changes CoGroupByKey typehint from List to Iterable (#22984)
[noreply] Fix TextSource incorrect handling in channels that return short reads.
------------------------------------------
[...truncated 33.88 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/27 08:43:59 Using specified **** binary: 'linux_amd64/combine'
2022/09/27 08:44:00 Prepared job with id: load-tests-go-flink-batch-combine-1-0927065319_621696c1-1f65-4a4c-a4d0-549b1ef5dc04 and staging token: load-tests-go-flink-batch-combine-1-0927065319_621696c1-1f65-4a4c-a4d0-549b1ef5dc04
2022/09/27 08:44:04 Staged binary artifact with token:
2022/09/27 08:44:05 Submitted job: load0tests0go0flink0batch0combine0100927065319-root-0927084404-edbaace8_62325ff8-4d3e-49d0-8bcf-9498893e1bba
2022/09/27 08:44:05 Job state: STOPPED
2022/09/27 08:44:05 Job state: STARTING
2022/09/27 08:44:05 Job state: RUNNING
2022/09/27 08:45:14 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/27 08:45:14 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/27 08:45:14 Job state: FAILED
2022/09/27 08:45:14 Failed to execute job: job load0tests0go0flink0batch0combine0100927065319-root-0927084404-edbaace8_62325ff8-4d3e-49d0-8bcf-9498893e1bba failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100927065319-root-0927084404-edbaace8_62325ff8-4d3e-49d0-8bcf-9498893e1bba failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1625f28, 0xc0001a6000}, {0x1488e91?, 0x1f9ce98?}, {0xc000387e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/y64x254hz4oqm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #662
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/662/display/redirect?page=changes>
Changes:
[noreply] Bump Java FnApi Container version to beam-master-20220923 (#23352)
------------------------------------------
[...truncated 33.57 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/26 08:43:11 Using specified worker binary: 'linux_amd64/combine'
2022/09/26 08:43:12 Prepared job with id: load-tests-go-flink-batch-combine-1-0926065322_69d65245-a241-4dce-a342-c59adb206d55 and staging token: load-tests-go-flink-batch-combine-1-0926065322_69d65245-a241-4dce-a342-c59adb206d55
2022/09/26 08:43:16 Staged binary artifact with token:
2022/09/26 08:43:17 Submitted job: load0tests0go0flink0batch0combine0100926065322-root-0926084316-9cded081_5911e721-b40e-45a5-9bf9-83df97e7a688
2022/09/26 08:43:17 Job state: STOPPED
2022/09/26 08:43:17 Job state: STARTING
2022/09/26 08:43:17 Job state: RUNNING
2022/09/26 08:44:26 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/26 08:44:26 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/26 08:44:26 Job state: FAILED
2022/09/26 08:44:26 Failed to execute job: job load0tests0go0flink0batch0combine0100926065322-root-0926084316-9cded081_5911e721-b40e-45a5-9bf9-83df97e7a688 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100926065322-root-0926084316-9cded081_5911e721-b40e-45a5-9bf9-83df97e7a688 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1625f28, 0xc00004a0c0}, {0x1488e91?, 0x1f9ce98?}, {0xc0004e5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 25s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uzh6psgaeolge
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #661
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/661/display/redirect?page=changes>
Changes:
[noreply] Extract playground components (#23253)
------------------------------------------
[...truncated 33.80 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/25 08:43:39 Using specified worker binary: 'linux_amd64/combine'
2022/09/25 08:43:39 Prepared job with id: load-tests-go-flink-batch-combine-1-0925065312_5b47e299-40d2-477b-830e-ea0cbfbed739 and staging token: load-tests-go-flink-batch-combine-1-0925065312_5b47e299-40d2-477b-830e-ea0cbfbed739
2022/09/25 08:43:43 Staged binary artifact with token:
2022/09/25 08:43:44 Submitted job: load0tests0go0flink0batch0combine0100925065312-root-0925084344-73faaca5_79576299-159b-4624-a29e-e763408198c4
2022/09/25 08:43:44 Job state: STOPPED
2022/09/25 08:43:44 Job state: STARTING
2022/09/25 08:43:44 Job state: RUNNING
2022/09/25 08:44:53 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/25 08:44:53 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/25 08:44:53 Job state: FAILED
2022/09/25 08:44:53 Failed to execute job: job load0tests0go0flink0batch0combine0100925065312-root-0925084344-73faaca5_79576299-159b-4624-a29e-e763408198c4 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100925065312-root-0925084344-73faaca5_79576299-159b-4624-a29e-e763408198c4 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1625f28, 0xc00012e000}, {0x1488e91?, 0x1f9ce98?}, {0xc0004e7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4luklozo4itcw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
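Each of these runs fails the same way: the Flink REST client receives a `JobDetailsInfo` response whose `maxParallelism` is JSON `null`, and Jackson aborts with `MismatchedInputException` because the target field is a primitive `long`. As a minimal stand-alone illustration (plain Java, not Flink or Jackson code; the class and method names below are hypothetical), this is the unboxing hazard that `FAIL_ON_NULL_FOR_PRIMITIVES` guards against:

```java
// Not Flink source: a sketch of why a JSON null cannot populate a
// primitive `long` field such as JobDetailsInfo.maxParallelism.
// Unboxing a null Long throws NullPointerException, so Jackson fails
// fast rather than silently producing a bogus value.
public class NullUnboxDemo {
    // Hypothetical stand-in for a primitive-typed creator parameter.
    static long toPrimitive(Long boxed) {
        return boxed; // auto-unboxing; throws NullPointerException for null
    }

    public static void main(String[] args) {
        try {
            toPrimitive(null);
            System.out.println("unboxed without error");
        } catch (NullPointerException e) {
            System.out.println("NullPointerException: cannot unbox null into long");
        }
    }
}
```

Disabling `FAIL_ON_NULL_FOR_PRIMITIVES`, as the error message suggests, would coerce the null to 0 instead of failing; whether that is the right fix here depends on Flink's REST message definitions rather than on the Beam test harness.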
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #660
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/660/display/redirect?page=changes>
Changes:
[Moritz Mack] Fix Nexmark default log level
[noreply] Bump cloud.google.com/go/storage from 1.26.0 to 1.27.0 in /sdks (#23336)
[noreply] lint fixes to go (#23351)
[noreply] Bump cloud.google.com/go/bigquery from 1.41.0 to 1.42.0 in /sdks
------------------------------------------
[...truncated 33.81 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/24 08:44:03 Using specified **** binary: 'linux_amd64/combine'
2022/09/24 08:44:03 Prepared job with id: load-tests-go-flink-batch-combine-1-0924065318_c324cd1a-d950-406b-acbf-1de72be9d3dc and staging token: load-tests-go-flink-batch-combine-1-0924065318_c324cd1a-d950-406b-acbf-1de72be9d3dc
2022/09/24 08:44:07 Staged binary artifact with token:
2022/09/24 08:44:08 Submitted job: load0tests0go0flink0batch0combine0100924065318-root-0924084407-9deb4638_ed094568-7010-4f6d-b4ca-1aa5a9bf583e
2022/09/24 08:44:08 Job state: STOPPED
2022/09/24 08:44:08 Job state: STARTING
2022/09/24 08:44:08 Job state: RUNNING
2022/09/24 08:45:17 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/24 08:45:17 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/24 08:45:17 Job state: FAILED
2022/09/24 08:45:17 Failed to execute job: job load0tests0go0flink0batch0combine0100924065318-root-0924084407-9deb4638_ed094568-7010-4f6d-b4ca-1aa5a9bf583e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100924065318-root-0924084407-9deb4638_ed094568-7010-4f6d-b4ca-1aa5a9bf583e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1625f28, 0xc00012e000}, {0x1488e91?, 0x1f9ce98?}, {0xc0006f1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vqv3j2kur7wcw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #659
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/659/display/redirect?page=changes>
Changes:
[Steve Niemitz] use avro DataFileReader to read avro container files
[bvolpato] Do not use .get() on ValueProvider during pipeline creation
[noreply] Improved pipeline translation in SparkStructuredStreamingRunner (#22446)
[noreply] Change google_cloud_bigdataoss_version to 2.2.8. (#23300)
------------------------------------------
[...truncated 33.85 KB...]
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/23 08:43:53 Using specified **** binary: 'linux_amd64/combine'
2022/09/23 08:43:53 Prepared job with id: load-tests-go-flink-batch-combine-1-0923065324_03236fa6-26af-47b9-9b91-85ae867e8e40 and staging token: load-tests-go-flink-batch-combine-1-0923065324_03236fa6-26af-47b9-9b91-85ae867e8e40
2022/09/23 08:43:58 Staged binary artifact with token:
2022/09/23 08:43:59 Submitted job: load0tests0go0flink0batch0combine0100923065324-root-0923084358-5ee32053_d1e15062-4ccb-478f-a3f5-ca6a68ac80e9
2022/09/23 08:43:59 Job state: STOPPED
2022/09/23 08:43:59 Job state: STARTING
2022/09/23 08:43:59 Job state: RUNNING
2022/09/23 08:45:08 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/23 08:45:08 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/23 08:45:08 Job state: FAILED
2022/09/23 08:45:08 Failed to execute job: job load0tests0go0flink0batch0combine0100923065324-root-0923084358-5ee32053_d1e15062-4ccb-478f-a3f5-ca6a68ac80e9 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100923065324-root-0923084358-5ee32053_d1e15062-4ccb-478f-a3f5-ca6a68ac80e9 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1624a08, 0xc00012e000}, {0x1487b9c?, 0x1f9be98?}, {0xc000243e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/iep66t5nm56gs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #658
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/658/display/redirect?page=changes>
Changes:
[noreply] [Java SDK core] emit watermark from PeriodicSequence (#23301) (#23302)
[noreply] Extend protocol in windmill.proto used by google-cloud-dataflow-java
[noreply] Allow longer Class-Path entries (#23269)
------------------------------------------
[...truncated 33.74 KB...]
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/22 08:43:32 Using specified worker binary: 'linux_amd64/combine'
2022/09/22 08:43:33 Prepared job with id: load-tests-go-flink-batch-combine-1-0922065314_70befa6d-c47e-40d8-8f5c-65931928b5d1 and staging token: load-tests-go-flink-batch-combine-1-0922065314_70befa6d-c47e-40d8-8f5c-65931928b5d1
2022/09/22 08:43:37 Staged binary artifact with token:
2022/09/22 08:43:38 Submitted job: load0tests0go0flink0batch0combine0100922065314-root-0922084337-e3422d6d_6b04f335-13e4-4007-b602-3cc92dcf4380
2022/09/22 08:43:38 Job state: STOPPED
2022/09/22 08:43:38 Job state: STARTING
2022/09/22 08:43:38 Job state: RUNNING
2022/09/22 08:44:47 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/22 08:44:47 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/22 08:44:47 Job state: FAILED
2022/09/22 08:44:47 Failed to execute job: job load0tests0go0flink0batch0combine0100922065314-root-0922084337-e3422d6d_6b04f335-13e4-4007-b602-3cc92dcf4380 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100922065314-root-0922084337-e3422d6d_6b04f335-13e4-4007-b602-3cc92dcf4380 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1624a08, 0xc00004a0c0}, {0x1487b9c?, 0x1f9be98?}, {0xc000329e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/j66iqtxo4lb4q
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
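The root cause in the build #658 stack trace above is Jackson's strict primitive mapping: the Flink REST client deserializes the JobManager's JSON response into JobDetailsInfo, whose maxParallelism field is a primitive long, so a null (or absent) value cannot be represented and deserialization aborts with MismatchedInputException — typically a sign of a client/server version mismatch on the REST API. As a hedged illustration only (the real code is Java/Jackson; parse_job_details is a hypothetical helper, not part of Flink or Beam), the failure mode can be sketched like this:

```python
# Hypothetical sketch of the strict-deserialization failure seen in the
# trace above: a field typed as a primitive long cannot hold JSON null,
# so the whole response is rejected. Not Flink code -- an illustration.
import json


def parse_job_details(payload: str) -> dict:
    """Parse a JobDetailsInfo-like response, requiring maxParallelism to be an int."""
    data = json.loads(payload)
    value = data.get("maxParallelism")
    # Mirrors Jackson's MismatchedInputException for null -> long.
    if not isinstance(value, int) or isinstance(value, bool):
        raise ValueError('Cannot map null into type long (field "maxParallelism")')
    return data


# A response from a JobManager that reports maxParallelism as null fails:
try:
    parse_job_details('{"jid": "abc", "maxParallelism": null}')
except ValueError as e:
    print("rejected:", e)

# A response that carries the field parses normally:
print(parse_job_details('{"jid": "abc", "maxParallelism": 128}'))
```

This also shows why Jackson's suggested workaround (disabling FAIL_ON_NULL_FOR_PRIMITIVES) only papers over the mismatch: the client would silently read 0 instead of a real value.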
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #657
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/657/display/redirect>
Changes:
------------------------------------------
[...truncated 33.83 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/21 08:43:48 Using specified worker binary: 'linux_amd64/combine'
2022/09/21 08:43:49 Prepared job with id: load-tests-go-flink-batch-combine-1-0921065329_3c7f4ed9-1dd8-403d-a100-4b2a9eb79240 and staging token: load-tests-go-flink-batch-combine-1-0921065329_3c7f4ed9-1dd8-403d-a100-4b2a9eb79240
2022/09/21 08:43:53 Staged binary artifact with token:
2022/09/21 08:43:54 Submitted job: load0tests0go0flink0batch0combine0100921065329-root-0921084353-ce3f110a_55db1ca7-2171-4b28-8eb4-6a508d1cc84a
2022/09/21 08:43:54 Job state: STOPPED
2022/09/21 08:43:54 Job state: STARTING
2022/09/21 08:43:54 Job state: RUNNING
2022/09/21 08:45:03 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/21 08:45:03 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/21 08:45:03 Job state: FAILED
2022/09/21 08:45:03 Failed to execute job: job load0tests0go0flink0batch0combine0100921065329-root-0921084353-ce3f110a_55db1ca7-2171-4b28-8eb4-6a508d1cc84a failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100921065329-root-0921084353-ce3f110a_55db1ca7-2171-4b28-8eb4-6a508d1cc84a failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1624a08, 0xc00012e000}, {0x1487b9c?, 0x1f9be98?}, {0xc000265e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/teo5pfxfj3772
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #656
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/656/display/redirect?page=changes>
Changes:
[Pablo Estrada] Revert "Trying out property-based tests for Beam python coders (#22233)"
[noreply] Bump google.golang.org/api from 0.95.0 to 0.96.0 in /sdks (#23246)
[noreply] [Go SDK] Add timer coder support (#23222)
[noreply] Fix wrong comment (#23272)
[noreply] [Playground] [Backend] Cache component for playground examples (#22869)
[noreply] [BEAM-13416] Introduce Schema provider for AWS models and deprecate low
[noreply] [BEAM-14378] [CdapIO] SparkReceiverIO Read via SDF (#17828)
------------------------------------------
[...truncated 33.83 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/20 08:44:12 Using specified worker binary: 'linux_amd64/combine'
2022/09/20 08:44:12 Prepared job with id: load-tests-go-flink-batch-combine-1-0920065315_cfc1f5ad-33f5-446a-9564-2ad60caa9075 and staging token: load-tests-go-flink-batch-combine-1-0920065315_cfc1f5ad-33f5-446a-9564-2ad60caa9075
2022/09/20 08:44:17 Staged binary artifact with token:
2022/09/20 08:44:18 Submitted job: load0tests0go0flink0batch0combine0100920065315-root-0920084417-f48b83a9_513b4083-bec2-4ca0-a3f8-910ee9e1eefc
2022/09/20 08:44:18 Job state: STOPPED
2022/09/20 08:44:18 Job state: STARTING
2022/09/20 08:44:18 Job state: RUNNING
2022/09/20 08:45:27 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/20 08:45:27 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/20 08:45:27 Job state: FAILED
2022/09/20 08:45:27 Failed to execute job: job load0tests0go0flink0batch0combine0100920065315-root-0920084417-f48b83a9_513b4083-bec2-4ca0-a3f8-910ee9e1eefc failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100920065315-root-0920084417-f48b83a9_513b4083-bec2-4ca0-a3f8-910ee9e1eefc failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1624a08, 0xc0001a6000}, {0x1487b9c?, 0x1f9be98?}, {0xc000167e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 50s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/itvfojq4wkmp6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
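The root cause in the trace above is Jackson refusing to map a JSON `null` into the primitive `long` field `JobDetailsInfo.maxParallelism` while the Flink REST client polls a still-initializing job. The behavior can be sketched with a hypothetical strict parser (Python, illustrative only; the field name `maxParallelism` comes from the log, everything else here is an assumption, not Flink or Beam code):

```python
import json

def parse_job_details(payload: str) -> dict:
    """Hypothetical strict parser mirroring Jackson's FAIL_ON_NULL_FOR_PRIMITIVES:
    a JSON null cannot populate a primitive long, so parsing raises instead of
    silently defaulting the field to 0."""
    details = json.loads(payload)
    max_parallelism = details.get("maxParallelism")
    if max_parallelism is None:
        # Mirrors the MismatchedInputException seen in the log above.
        raise ValueError('Cannot map null into type long: "maxParallelism"')
    return {"maxParallelism": int(max_parallelism)}

# A JobManager that has not finished initializing the job may report the
# field as null, which a strict parser rejects:
try:
    parse_job_details('{"jid": "abc", "maxParallelism": null}')
except ValueError as e:
    print(e)  # Cannot map null into type long: "maxParallelism"
```

In Jackson terms, the exception message suggests the workaround itself: disabling `DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES` would coerce the `null` to `0` rather than failing the response parse.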
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #655
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/655/display/redirect?page=changes>
Changes:
[noreply] Enable verbose output for RAT Precommit (#23279)
------------------------------------------
[...truncated 33.75 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/19 08:43:39 Using specified worker binary: 'linux_amd64/combine'
2022/09/19 08:43:40 Prepared job with id: load-tests-go-flink-batch-combine-1-0919065314_0ba6c301-303b-4362-9dde-9d5b8ca06f46 and staging token: load-tests-go-flink-batch-combine-1-0919065314_0ba6c301-303b-4362-9dde-9d5b8ca06f46
2022/09/19 08:43:44 Staged binary artifact with token:
2022/09/19 08:43:46 Submitted job: load0tests0go0flink0batch0combine0100919065314-root-0919084344-a609277b_b5e228d5-cf44-4921-a751-9acba9e81016
2022/09/19 08:43:46 Job state: STOPPED
2022/09/19 08:43:46 Job state: STARTING
2022/09/19 08:43:46 Job state: RUNNING
2022/09/19 08:44:55 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/19 08:44:55 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/19 08:44:55 Job state: FAILED
2022/09/19 08:44:55 Failed to execute job: job load0tests0go0flink0batch0combine0100919065314-root-0919084344-a609277b_b5e228d5-cf44-4921-a751-9acba9e81016 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100919065314-root-0919084344-a609277b_b5e228d5-cf44-4921-a751-9acba9e81016 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1623248, 0xc00004a0c0}, {0x148692f?, 0x1f99e78?}, {0xc0006a9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/auvh22usur5b4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #654
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/654/display/redirect?page=changes>
Changes:
[noreply] updated the pydoc for running a custom model on Beam (#23218)
[noreply] Add drop_example flag to the RunInference and Model Handler (#23266)
------------------------------------------
[...truncated 33.96 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/18 08:44:08 Using specified worker binary: 'linux_amd64/combine'
2022/09/18 08:44:09 Prepared job with id: load-tests-go-flink-batch-combine-1-0918065319_c61826e8-1920-4034-b778-48970bdc6b92 and staging token: load-tests-go-flink-batch-combine-1-0918065319_c61826e8-1920-4034-b778-48970bdc6b92
2022/09/18 08:44:13 Staged binary artifact with token:
2022/09/18 08:44:14 Submitted job: load0tests0go0flink0batch0combine0100918065319-root-0918084413-cfcca22b_531ec987-7fc3-40a4-92ce-fea2a646e3e3
2022/09/18 08:44:14 Job state: STOPPED
2022/09/18 08:44:14 Job state: STARTING
2022/09/18 08:44:14 Job state: RUNNING
2022/09/18 08:45:22 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/18 08:45:22 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/18 08:45:22 Job state: FAILED
2022/09/18 08:45:22 Failed to execute job: job load0tests0go0flink0batch0combine0100918065319-root-0918084413-cfcca22b_531ec987-7fc3-40a4-92ce-fea2a646e3e3 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100918065319-root-0918084413-cfcca22b_531ec987-7fc3-40a4-92ce-fea2a646e3e3 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1623248, 0xc00004a0c0}, {0x148692f?, 0x1f99e78?}, {0xc000253e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/xxzoq4pmrdgmg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #653
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/653/display/redirect?page=changes>
Changes:
[noreply] Bump cloud.google.com/go/bigquery from 1.40.0 to 1.41.0 in /sdks
[noreply] [Website] Correct spelling of structural (#23225)
[noreply] TensorRT Initial commit (#22131)
[noreply] Fix Kafka performance test sourceOption to match expected hash (#23274)
------------------------------------------
[...truncated 33.84 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/17 08:43:50 Using specified worker binary: 'linux_amd64/combine'
2022/09/17 08:43:51 Prepared job with id: load-tests-go-flink-batch-combine-1-0917065323_916a6d78-ce69-4ed3-8864-d302c68daa4a and staging token: load-tests-go-flink-batch-combine-1-0917065323_916a6d78-ce69-4ed3-8864-d302c68daa4a
2022/09/17 08:43:56 Staged binary artifact with token:
2022/09/17 08:43:57 Submitted job: load0tests0go0flink0batch0combine0100917065323-root-0917084356-c29742ad_e8a566d0-58ed-4220-a171-c2434aa42a22
2022/09/17 08:43:57 Job state: STOPPED
2022/09/17 08:43:57 Job state: STARTING
2022/09/17 08:43:57 Job state: RUNNING
2022/09/17 08:45:06 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/17 08:45:06 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/17 08:45:06 Job state: FAILED
2022/09/17 08:45:06 Failed to execute job: job load0tests0go0flink0batch0combine0100917065323-root-0917084356-c29742ad_e8a566d0-58ed-4220-a171-c2434aa42a22 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100917065323-root-0917084356-c29742ad_e8a566d0-58ed-4220-a171-c2434aa42a22 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1623248, 0xc00012e000}, {0x148692f?, 0x1f99e78?}, {0xc00062de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 45s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jtmhtnjor73mg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #652
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/652/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] update site navigation #22902
[noreply] Pass namespace through RunInference transform (#23182)
[noreply] [GitHub Actions] - INFRA scripts to implement GCP Self-hosted runners
[noreply] GA migration - Base actions to use for precommit and postcommit
[noreply] Test fix Kafka Performance test batch (#23191)
[noreply] Revert "Exclude protobuf 3.20.2" (#23237)
[noreply] Fix outdated code in python sdk install (#23231)
[noreply] Bump up dataflow python container version to beam-master-20220914
[noreply] Improve the performance of TextSource by reducing how many byte[]s are
[noreply] Issue#21430 Avoid pruning DataframeTransforms (#23069)
------------------------------------------
[...truncated 33.68 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/16 08:43:40 Using specified worker binary: 'linux_amd64/combine'
2022/09/16 08:43:40 Prepared job with id: load-tests-go-flink-batch-combine-1-0916065321_cc292168-93f3-4901-b4f8-0a78baf56032 and staging token: load-tests-go-flink-batch-combine-1-0916065321_cc292168-93f3-4901-b4f8-0a78baf56032
2022/09/16 08:43:44 Staged binary artifact with token:
2022/09/16 08:43:46 Submitted job: load0tests0go0flink0batch0combine0100916065321-root-0916084345-c811197d_cb4a6999-bf4c-45ca-8e43-df8c3dd2b839
2022/09/16 08:43:46 Job state: STOPPED
2022/09/16 08:43:46 Job state: STARTING
2022/09/16 08:43:46 Job state: RUNNING
2022/09/16 08:44:54 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/16 08:44:54 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
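[Editor's note: the MismatchedInputException above is the root cause of the failure: the Flink JobManager's REST endpoint returned a JobDetailsInfo payload whose "maxParallelism" field was null, and Jackson's FAIL_ON_NULL_FOR_PRIMITIVES setting refuses to map JSON null into a primitive long. The sketch below is a hypothetical, plain-Python imitation of that strict behavior (the parse_job_details helper and its message text are invented for illustration, not part of Beam or Flink):

```python
import json

def parse_job_details(payload: str) -> dict:
    """Mimic a deserializer that rejects null for a primitive numeric field."""
    details = json.loads(payload)
    value = details.get("maxParallelism")
    if not isinstance(value, int):
        # Jackson raises MismatchedInputException at the analogous point.
        raise ValueError(
            f"Cannot map {value!r} into type 'long' "
            "(reference chain: JobDetailsInfo[\"maxParallelism\"])"
        )
    return details

# A well-formed response parses; a response with a null field does not,
# which is the shape of the error the Beam job service logged above.
parse_job_details('{"maxParallelism": 128}')
try:
    parse_job_details('{"maxParallelism": null}')
except ValueError as e:
    print(e)
```

This kind of mismatch typically points at a client/server version skew, where one side considers the field optional and the other treats it as a required primitive.]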
2022/09/16 08:44:54 Job state: FAILED
2022/09/16 08:44:54 Failed to execute job: job load0tests0go0flink0batch0combine0100916065321-root-0916084345-c811197d_cb4a6999-bf4c-45ca-8e43-df8c3dd2b839 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100916065321-root-0916084345-c811197d_cb4a6999-bf4c-45ca-8e43-df8c3dd2b839 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1623248, 0xc0001a8000}, {0x148692f?, 0x1f99e78?}, {0xc0003dfe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 26s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/iqwawlcoixuey
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #651
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/651/display/redirect?page=changes>
Changes:
[noreply] Fix assignees check
[noreply] Exclude protobuf 3.20.2 (#23226)
[noreply] Fix IllegalStateException in StorageApiWriteUnshardedRecords error
[noreply] Update cibuildwheel (#23024)
[noreply] Add section to docs on resource hints/RunInference (#23215)
[noreply] (BQ Python) Perform job waits in finish_bundle to allow BQ streaming
[noreply] Update to newest version of CloudPickle. (#23223)
[noreply] Resolve script parsing error when changing from bash to sh. (#23199)
[noreply] Bump cloud.google.com/go/bigquery from 1.39.0 to 1.40.0 in /sdks
[noreply] Bump github.com/google/go-cmp from 0.5.8 to 0.5.9 in /sdks (#23123)
[noreply] Update google-cloud-bigquery requirement from <3,>=1.6.0 to >=1.6.0,<4
[noreply] Optimize varint reading and writing for small ints. (#23192)
------------------------------------------
[...truncated 33.86 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/15 08:43:53 Using specified worker binary: 'linux_amd64/combine'
2022/09/15 08:43:53 Prepared job with id: load-tests-go-flink-batch-combine-1-0915065325_c9da62e1-8748-4f71-8af6-bbc67dd1ab91 and staging token: load-tests-go-flink-batch-combine-1-0915065325_c9da62e1-8748-4f71-8af6-bbc67dd1ab91
2022/09/15 08:43:57 Staged binary artifact with token:
2022/09/15 08:43:58 Submitted job: load0tests0go0flink0batch0combine0100915065325-root-0915084358-aa7005b9_4c2b224d-75f7-4b54-86ac-9d4e7dddd3f7
2022/09/15 08:43:58 Job state: STOPPED
2022/09/15 08:43:58 Job state: STARTING
2022/09/15 08:43:58 Job state: RUNNING
2022/09/15 08:45:07 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/15 08:45:07 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/15 08:45:07 Job state: FAILED
2022/09/15 08:45:07 Failed to execute job: job load0tests0go0flink0batch0combine0100915065325-root-0915084358-aa7005b9_4c2b224d-75f7-4b54-86ac-9d4e7dddd3f7 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100915065325-root-0915084358-aa7005b9_4c2b224d-75f7-4b54-86ac-9d4e7dddd3f7 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1623248, 0xc00012e000}, {0x148692f?, 0x1f99e78?}, {0xc000669e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 44s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/6555q477cttlm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #650
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/650/display/redirect?page=changes>
Changes:
[Moritz Mack] Annotate stateful VR test in TestStreamTest with UsesStatefulParDo
[Moritz Mack] Properly close Spark (streaming) context if Pipeline translation fails
[noreply] [Playground] [Backend] Datastore queries and mappers to get precompiled
[noreply] Open Allow and test pyarrow 8.x and 9.x (#22997)
[noreply] (BQ Python) Pass project field from options or parameter when writing
[noreply] Update python-machine-learning.md (#23209)
[noreply] Pin the version of cloudpickle to 2.1.x (#23120)
[noreply] Add streaming test for Write API sink (#21903)
[noreply] [Go SDK] Proto changes for timer param (#23216)
[noreply] Bump github.com/testcontainers/testcontainers-go in /sdks (#23201)
[noreply] Update to objsize to 0.5.2 which is under BSD-3 license (fixes #23096)
[noreply] Exclude insignificant whitespace from cloud object (#23217)
[noreply] Trying out property-based tests for Beam python coders (#22233)
[noreply] Publish results of JMH benchmark runs (Java SDK) to InfluxDB (part of
------------------------------------------
[...truncated 33.89 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/14 08:44:14 Using specified worker binary: 'linux_amd64/combine'
2022/09/14 08:44:14 Prepared job with id: load-tests-go-flink-batch-combine-1-0914065318_182d23d2-8d82-477b-85c2-7a937b15f78a and staging token: load-tests-go-flink-batch-combine-1-0914065318_182d23d2-8d82-477b-85c2-7a937b15f78a
2022/09/14 08:44:19 Staged binary artifact with token:
2022/09/14 08:44:20 Submitted job: load0tests0go0flink0batch0combine0100914065318-root-0914084419-68302f5a_402d4099-0759-400d-95f7-c5b9dbae953c
2022/09/14 08:44:20 Job state: STOPPED
2022/09/14 08:44:20 Job state: STARTING
2022/09/14 08:44:20 Job state: RUNNING
2022/09/14 08:45:28 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/14 08:45:28 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/14 08:45:28 Job state: FAILED
2022/09/14 08:45:28 Failed to execute job: job load0tests0go0flink0batch0combine0100914065318-root-0914084419-68302f5a_402d4099-0759-400d-95f7-c5b9dbae953c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100914065318-root-0914084419-68302f5a_402d4099-0759-400d-95f7-c5b9dbae953c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1621068, 0xc00012e000}, {0x148492f?, 0x1f96e38?}, {0xc0005bfe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vdbhl3u77q2na
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #649
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/649/display/redirect?page=changes>
Changes:
[noreply] pubsublite: Reduce commit logspam (#22762)
[noreply] Added documentation in ACTIONS.md file (#23159)
[noreply] Bump dataflow java fnapi container version to beam-master-20220830
[noreply] [Issue#23071] Fix AfterProcessingTime for Python to behave like Java
[noreply] Don't depend on java 11 docker container for go test (#23197)
------------------------------------------
[...truncated 33.77 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/13 08:43:38 Using specified worker binary: 'linux_amd64/combine'
2022/09/13 08:43:38 Prepared job with id: load-tests-go-flink-batch-combine-1-0913065322_d79197e9-7fa2-4f4b-bab8-17d50fb49432 and staging token: load-tests-go-flink-batch-combine-1-0913065322_d79197e9-7fa2-4f4b-bab8-17d50fb49432
2022/09/13 08:43:43 Staged binary artifact with token:
2022/09/13 08:43:44 Submitted job: load0tests0go0flink0batch0combine0100913065322-root-0913084343-30d8967a_ea534b56-3e73-4e3f-a587-adf350d63172
2022/09/13 08:43:44 Job state: STOPPED
2022/09/13 08:43:44 Job state: STARTING
2022/09/13 08:43:44 Job state: RUNNING
2022/09/13 08:44:53 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/13 08:44:53 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/13 08:44:53 Job state: FAILED
2022/09/13 08:44:53 Failed to execute job: job load0tests0go0flink0batch0combine0100913065322-root-0913084343-30d8967a_ea534b56-3e73-4e3f-a587-adf350d63172 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100913065322-root-0913084343-30d8967a_ea534b56-3e73-4e3f-a587-adf350d63172 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1621068, 0xc00004a0c0}, {0x1484922?, 0x1f96e18?}, {0xc000741e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vrn7l5hxv2uus
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #648
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/648/display/redirect?page=changes>
Changes:
[noreply] [TPC-DS] Use common queries argument for Jenkins jobs (#23139)
------------------------------------------
[...truncated 33.82 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/12 08:43:30 Using specified worker binary: 'linux_amd64/combine'
2022/09/12 08:43:31 Prepared job with id: load-tests-go-flink-batch-combine-1-0912065312_a64e08aa-83f2-4c2b-9cdd-c548a09a0b03 and staging token: load-tests-go-flink-batch-combine-1-0912065312_a64e08aa-83f2-4c2b-9cdd-c548a09a0b03
2022/09/12 08:43:35 Staged binary artifact with token:
2022/09/12 08:43:36 Submitted job: load0tests0go0flink0batch0combine0100912065312-root-0912084335-ea111e14_9115a1cd-28f2-48e3-91fd-4fe174c03780
2022/09/12 08:43:36 Job state: STOPPED
2022/09/12 08:43:36 Job state: STARTING
2022/09/12 08:43:36 Job state: RUNNING
2022/09/12 08:44:45 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/12 08:44:45 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/12 08:44:45 Job state: FAILED
2022/09/12 08:44:45 Failed to execute job: job load0tests0go0flink0batch0combine0100912065312-root-0912084335-ea111e14_9115a1cd-28f2-48e3-91fd-4fe174c03780 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100912065312-root-0912084335-ea111e14_9115a1cd-28f2-48e3-91fd-4fe174c03780 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1621068, 0xc00004a0c0}, {0x1484922?, 0x1f96e18?}, {0xc00014be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hgdwb5iplblzu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
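The failure above bottoms out in a Jackson `MismatchedInputException`: the Flink REST client binds the `JobDetailsInfo` response strictly, so a `null` value for `maxParallelism` (typically a symptom of a version mismatch between the submitting client and the Flink cluster, since older JobManagers may omit that field) cannot be mapped into a primitive `long`. The following stdlib-only Python sketch is not Flink code — the field names and helpers are illustrative — but it reproduces the same failure mode the log's error message describes, contrasting strict binding (the `FAIL_ON_NULL_FOR_PRIMITIVES` default) with a tolerant fallback:

```python
import json

# Hypothetical REST response body in the shape the log complains about:
# "maxParallelism" is present but null, and the target field is a
# non-nullable integer (a Java `long` in Flink's JobDetailsInfo).
response = '{"jid": "abc123", "name": "combine", "maxParallelism": null}'

def parse_strict(body: str) -> int:
    """Mimics FAIL_ON_NULL_FOR_PRIMITIVES=true: null cannot become a long."""
    value = json.loads(body)["maxParallelism"]
    if value is None:
        raise ValueError("Cannot map `null` into type `long`")
    return int(value)

def parse_tolerant(body: str, default: int = -1) -> int:
    """Mimics FAIL_ON_NULL_FOR_PRIMITIVES=false: null falls back to a default."""
    value = json.loads(body)["maxParallelism"]
    return default if value is None else int(value)

try:
    parse_strict(response)
except ValueError as exc:
    print("strict binding fails:", exc)

print("tolerant binding yields:", parse_tolerant(response))
```

In practice the durable fix is usually aligning the Flink client version used by the Beam job server with the cluster's version, rather than relaxing deserialization on the client.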
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #647
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/647/display/redirect>
Changes:
------------------------------------------
[...truncated 33.66 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/11 08:43:16 Using specified **** binary: 'linux_amd64/combine'
2022/09/11 08:43:17 Prepared job with id: load-tests-go-flink-batch-combine-1-0911065317_a5c925d2-bf60-45f7-a05d-2c85ad1d56ac and staging token: load-tests-go-flink-batch-combine-1-0911065317_a5c925d2-bf60-45f7-a05d-2c85ad1d56ac
2022/09/11 08:43:21 Staged binary artifact with token:
2022/09/11 08:43:22 Submitted job: load0tests0go0flink0batch0combine0100911065317-root-0911084321-41d70266_eb995c6f-bb05-495d-adb3-57ef9367b3ab
2022/09/11 08:43:22 Job state: STOPPED
2022/09/11 08:43:22 Job state: STARTING
2022/09/11 08:43:22 Job state: RUNNING
2022/09/11 08:44:31 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/11 08:44:31 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/11 08:44:31 Job state: FAILED
2022/09/11 08:44:31 Failed to execute job: job load0tests0go0flink0batch0combine0100911065317-root-0911084321-41d70266_eb995c6f-bb05-495d-adb3-57ef9367b3ab failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100911065317-root-0911084321-41d70266_eb995c6f-bb05-495d-adb3-57ef9367b3ab failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1621068, 0xc00004a0c0}, {0x1484922?, 0x1f96e18?}, {0xc0006fde70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 32s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hgk7r57twg5vm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #646
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/646/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] add paddings to pillars-item, change styles of footer logos
[bulat.safiullin] [Website] add table-container-wrapper #22896
[bulat.safiullin] [Website] update shortcode languages from duplicate go to typescript
[cushon] Use a ClassLoadingStrategy that is compatible with Java 17+
[noreply] [TPC-DS] Store metrics into BigQuery and InfluxDB (#22545)
[noreply] [Website] update case-studies logo images #22799 (#22793)
[noreply] [Website] change media-query max-width variable to ak-breakpoint-xl
[noreply] [Website] add overflow to code tags #22888 (#22427)
[noreply] Clean up Kafka Cluster and pubsub topic in rc validation script (#23021)
[noreply] Fix assertions in the Spanner IO IT tests (#23098)
[noreply] Use existing pickle_library flag in expansion service. (#23111)
[noreply] Assert pipeline results in performance tests (#23027)
[noreply] Consolidate Samza TranslationContext and PortableTranslationContext
[noreply] Improvements to SchemaTransform implementations for BQ and Kafka
------------------------------------------
[...truncated 33.72 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/10 08:43:20 Using specified **** binary: 'linux_amd64/combine'
2022/09/10 08:43:20 Prepared job with id: load-tests-go-flink-batch-combine-1-0910065318_4d4a7bc1-1183-43be-8d82-0d04ddefb153 and staging token: load-tests-go-flink-batch-combine-1-0910065318_4d4a7bc1-1183-43be-8d82-0d04ddefb153
2022/09/10 08:43:24 Staged binary artifact with token:
2022/09/10 08:43:25 Submitted job: load0tests0go0flink0batch0combine0100910065318-root-0910084324-9e189320_f9a93717-967f-460e-b10f-3ca7854b2657
2022/09/10 08:43:25 Job state: STOPPED
2022/09/10 08:43:25 Job state: STARTING
2022/09/10 08:43:25 Job state: RUNNING
2022/09/10 08:44:34 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/10 08:44:34 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/10 08:44:34 Job state: FAILED
2022/09/10 08:44:34 Failed to execute job: job load0tests0go0flink0batch0combine0100910065318-root-0910084324-9e189320_f9a93717-967f-460e-b10f-3ca7854b2657 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100910065318-root-0910084324-9e189320_f9a93717-967f-460e-b10f-3ca7854b2657 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1621068, 0xc0001a6000}, {0x1484922?, 0x1f96e18?}, {0xc00059fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wvir2a5dt2m7y
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #645
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/645/display/redirect?page=changes>
Changes:
[yathu] Decrease derby.locks.waitTimeout in jdbc unit test
[noreply] clean up comments and register functional DoFn in wordcount.go (#23057)
[noreply] [Tour Of Beam][backend] integration tests and GA workflow (#23032)
[noreply] Auto-cancel old unit test Actions Runs (#23095)
[noreply] Merge pull request #23092 Cross-language tests in github actions.
[noreply] Update CHANGES.md for 2.42.0 cut, and add 2.43.0 section (#23108)
[noreply] remove `"io/ioutil"` package (#23001)
[noreply] Add one NER example to use a spaCy model with RunInference (#23035)
[noreply] Bump google.golang.org/api from 0.94.0 to 0.95.0 in /sdks (#23062)
[noreply] Implement JsonUtils (#22771)
[noreply] Support models returning a dictionary of outputs (#23087)
------------------------------------------
[...truncated 33.74 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/09 08:43:31 Using specified **** binary: 'linux_amd64/combine'
2022/09/09 08:43:31 Prepared job with id: load-tests-go-flink-batch-combine-1-0909065319_78c98dcd-002e-4359-bc44-edae6cd33431 and staging token: load-tests-go-flink-batch-combine-1-0909065319_78c98dcd-002e-4359-bc44-edae6cd33431
2022/09/09 08:43:36 Staged binary artifact with token:
2022/09/09 08:43:37 Submitted job: load0tests0go0flink0batch0combine0100909065319-root-0909084336-ce9ca88f_7064ff69-16b2-4ae5-abc0-ecfbda7215e3
2022/09/09 08:43:37 Job state: STOPPED
2022/09/09 08:43:37 Job state: STARTING
2022/09/09 08:43:37 Job state: RUNNING
2022/09/09 08:44:46 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/09 08:44:46 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/09 08:44:46 Job state: FAILED
2022/09/09 08:44:46 Failed to execute job: job load0tests0go0flink0batch0combine0100909065319-root-0909084336-ce9ca88f_7064ff69-16b2-4ae5-abc0-ecfbda7215e3 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100909065319-root-0909084336-ce9ca88f_7064ff69-16b2-4ae5-abc0-ecfbda7215e3 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1621068, 0xc00012e000}, {0x1484922?, 0x1f96e18?}, {0xc0006ebe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uuryxw3dnibzk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #644
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/644/display/redirect?page=changes>
Changes:
[clementg] allow non-lts jvm version, fallback on java 11 for runner
[clementg] Add a stricter java version method
[clementg] fall back to the nearest lts version
[noreply] Bump github.com/lib/pq from 1.10.6 to 1.10.7 in /sdks (#23061)
[noreply] Allowing more flexible precision for TIMESTAMP, DATETIME fields in
[noreply] Reenable run-inference tests on windows (#23044)
[noreply] [BEAM-12164] Support new value capture types NEW_ROW NEW_VALUES for s…
[noreply] Fix example registration input arity (#23059)
[noreply] Clarify inference example docs (#23018)
[noreply] [Playground] [Backend] Datastore queries and mappers to get examples
[noreply] Keep stale action from closing issues (#23067)
[Robert Bradshaw] Use cloudpickle for Java Python transforms.
[noreply] Merge pull request #22996: [BEAM-11205] Update GCP Libraries BOM
[Robert Burke] Moving to 2.43.0-SNAPSHOT on master branch.
------------------------------------------
[...truncated 33.76 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/08 08:43:34 Using specified **** binary: 'linux_amd64/combine'
2022/09/08 08:43:35 Prepared job with id: load-tests-go-flink-batch-combine-1-0908065318_07921021-c93b-4720-99e9-10db940d0c3b and staging token: load-tests-go-flink-batch-combine-1-0908065318_07921021-c93b-4720-99e9-10db940d0c3b
2022/09/08 08:43:39 Staged binary artifact with token:
2022/09/08 08:43:40 Submitted job: load0tests0go0flink0batch0combine0100908065318-root-0908084339-d46bf23f_ee0fadea-00b6-4f88-9bf6-b5dc8f8a0ede
2022/09/08 08:43:40 Job state: STOPPED
2022/09/08 08:43:40 Job state: STARTING
2022/09/08 08:43:40 Job state: RUNNING
2022/09/08 08:44:49 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/08 08:44:49 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/08 08:44:49 Job state: FAILED
2022/09/08 08:44:49 Failed to execute job: job load0tests0go0flink0batch0combine0100908065318-root-0908084339-d46bf23f_ee0fadea-00b6-4f88-9bf6-b5dc8f8a0ede failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100908065318-root-0908084339-d46bf23f_ee0fadea-00b6-4f88-9bf6-b5dc8f8a0ede failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16210a8, 0xc00004a0c0}, {0x1484922?, 0x1f96e18?}, {0xc000299e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 44s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/h5zlek5pj7wyw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
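[Editor's note] The MismatchedInputException above is the root failure: the Flink REST client deserializes JobDetailsInfo with FAIL_ON_NULL_FOR_PRIMITIVES enabled, and the `maxParallelism` value in the REST response is null, which cannot be stored in a primitive `long`. Below is a minimal stdlib sketch of that constraint; the class and method names are illustrative, not Flink's actual code, and Jackson's real behavior is governed by the feature flag named in the log.

```java
public class NullIntoLongDemo {
    // Illustrative stand-in for a field like JobDetailsInfo.maxParallelism:
    // a primitive long has no representation for "value missing".
    public static long parseMaxParallelism(Long fromJson) {
        // A boxed Long with an explicit default is the hand-written
        // equivalent of disabling FAIL_ON_NULL_FOR_PRIMITIVES.
        return fromJson != null ? fromJson : 0L;
    }

    public static void main(String[] args) {
        Long missing = null; // field absent/null in the REST response
        try {
            long v = missing; // unboxing null throws NullPointerException
            System.out.println(v);
        } catch (NullPointerException e) {
            System.out.println("null cannot become a primitive long");
        }
        System.out.println(parseMaxParallelism(null)); // falls back to 0
    }
}
```

Disabling the feature, as the exception message suggests, would coerce the null to 0 instead of failing; whether 0 is an acceptable maxParallelism is a separate question.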
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #643
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/643/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Cosmetic checkstyle fix to TextRowCountEstimator
[Kenneth Knowles] Upgrade to Gradle 7.5.1
[Brian Hulette] Use typehints in benchmark utilities
[oleg.borisevich] fixing condition for db index creation
[noreply] Allow users to pass classloader to dynamically load JDBC drivers.
[noreply] Fix withCheckStopReadingFn to not cause the pipeline to crash (#22962)
[noreply] Inference benchmark tests (#21738)
[noreply] [Go SDK]: Add support for Google Cloud Profiler for pipelines (#22824)
[noreply] Listen to window messages to switch SDK and to load content (#22959)
[Robert Bradshaw] Allow expansion service to choose pickler.
[noreply] Disable singleIterate (#23042)
[Robert Bradshaw] Accept "default" as pickler library.
[Robert Bradshaw] Clarifying comment.
[Heejong Lee] [BEAM-22856] PythonService Beam version compatibility
[chamikaramj] Fixes RunInference test failure
------------------------------------------
[...truncated 33.81 KB...]
>
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/07 08:43:52 Using specified worker binary: 'linux_amd64/combine'
2022/09/07 08:43:52 Prepared job with id: load-tests-go-flink-batch-combine-1-0907065322_b35456ef-dfa6-4c90-9183-98a51b0f40e4 and staging token: load-tests-go-flink-batch-combine-1-0907065322_b35456ef-dfa6-4c90-9183-98a51b0f40e4
2022/09/07 08:43:56 Staged binary artifact with token:
2022/09/07 08:43:57 Submitted job: load0tests0go0flink0batch0combine0100907065322-root-0907084357-5b95fc73_17a5fc68-2b8d-4490-ad12-f08beba94d18
2022/09/07 08:43:57 Job state: STOPPED
2022/09/07 08:43:57 Job state: STARTING
2022/09/07 08:43:57 Job state: RUNNING
2022/09/07 08:45:06 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/07 08:45:06 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/07 08:45:06 Job state: FAILED
2022/09/07 08:45:06 Failed to execute job: job load0tests0go0flink0batch0combine0100907065322-root-0907084357-5b95fc73_17a5fc68-2b8d-4490-ad12-f08beba94d18 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100907065322-root-0907084357-5b95fc73_17a5fc68-2b8d-4490-ad12-f08beba94d18 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16210a8, 0xc000136000}, {0x1484922?, 0x1f96e18?}, {0xc000623e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/niaz4lks4v4da
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
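[Editor's note] The FutureUtils$RetryException layer in the traces above explains why the job submission hangs until the timeout: the job-status poll is retried with a delay, but every attempt hits the same non-transient parse error, so the retry budget is exhausted and the last cause is rethrown. A simplified sketch of that retry-until-exhausted pattern follows; the names are hypothetical and this is not Flink's FutureUtils implementation.

```java
import java.util.function.Supplier;

public class RetryDemo {
    // Thrown when all attempts fail, loosely modeled on Flink's
    // FutureUtils$RetryException ("Number of retries has been exhausted.").
    public static class RetriesExhaustedException extends RuntimeException {
        public RetriesExhaustedException(Throwable cause) {
            super("Could not complete the operation. "
                    + "Number of retries has been exhausted.", cause);
        }
    }

    // Run the operation once, plus up to `retries` extra attempts.
    // A deterministic failure (like the JSON parse error above) simply
    // burns through every attempt and surfaces as the final cause.
    public static <T> T retry(Supplier<T> op, int retries) {
        RuntimeException last = null;
        for (int attempt = 0; attempt <= retries; attempt++) {
            try {
                return op.get();
            } catch (RuntimeException e) {
                last = e; // remember the latest failure for the wrapper
            }
        }
        throw new RetriesExhaustedException(last);
    }

    public static void main(String[] args) {
        System.out.println(retry(() -> "ok", 3));
    }
}
```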
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #642
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/642/display/redirect?page=changes>
Changes:
[noreply] Revert "Remove subprocess.PIPE usage by using a temp file (#22654)"
------------------------------------------
[...truncated 33.78 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/06 08:43:27 Using specified worker binary: 'linux_amd64/combine'
2022/09/06 08:43:27 Prepared job with id: load-tests-go-flink-batch-combine-1-0906065314_552d73d0-564e-4df1-aace-ff18b81d2229 and staging token: load-tests-go-flink-batch-combine-1-0906065314_552d73d0-564e-4df1-aace-ff18b81d2229
2022/09/06 08:43:31 Staged binary artifact with token:
2022/09/06 08:43:32 Submitted job: load0tests0go0flink0batch0combine0100906065314-root-0906084331-55a5b882_7374d4ed-4cd6-4e04-b8fe-ab7f467a3871
2022/09/06 08:43:32 Job state: STOPPED
2022/09/06 08:43:32 Job state: STARTING
2022/09/06 08:43:32 Job state: RUNNING
2022/09/06 08:44:41 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
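The root-cause exception above is a REST-response/DTO mismatch: the Flink client deserializes `JobDetailsInfo`, whose `maxParallelism` is a primitive `long`, but the server's JSON carries `null` for that field, and the mapper in this log has Jackson's `FAIL_ON_NULL_FOR_PRIMITIVES` feature enabled, so deserialization throws `MismatchedInputException` instead of defaulting the field. The sketch below (requires `jackson-databind` on the classpath; `JobInfo` is a hypothetical stand-in for the Flink DTO — Flink's shaded internal REST mapper is not user-configurable) illustrates the toggle the error message names:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullPrimitiveDemo {
    // Hypothetical DTO mirroring the failing field: a primitive long
    // cannot absorb a JSON null.
    public static class JobInfo {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // Strict mapper (the behavior seen in the log): null -> primitive fails.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, JobInfo.class);
        } catch (MismatchedInputException e) {
            System.out.println("strict mapper rejects null: " + e.getOriginalMessage());
        }

        // Lenient mapper: with the feature disabled, null maps to the
        // primitive default (0L) instead of throwing.
        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        JobInfo info = lenient.readValue(json, JobInfo.class);
        System.out.println("lenient mapper: maxParallelism=" + info.maxParallelism);
    }
}
```

In this job the toggle is only illustrative: since the client's mapper is internal to Flink, the practical fix is to align the Flink client and session-cluster versions so the REST response actually populates `maxParallelism`.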
2022/09/06 08:44:41 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/06 08:44:41 Job state: FAILED
2022/09/06 08:44:41 Failed to execute job: job load0tests0go0flink0batch0combine0100906065314-root-0906084331-55a5b882_7374d4ed-4cd6-4e04-b8fe-ab7f467a3871 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100906065314-root-0906084331-55a5b882_7374d4ed-4cd6-4e04-b8fe-ab7f467a3871 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebd08, 0xc00012e000}, {0x14551cc?, 0x1f4c7b0?}, {0xc0004efe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/twamfxitrnyf2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #641
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/641/display/redirect?page=changes>
Changes:
[noreply] Generalize interface of InfluxDBPublisher to support more use cases
------------------------------------------
[...truncated 33.82 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/05 08:43:34 Using specified **** binary: 'linux_amd64/combine'
2022/09/05 08:43:34 Prepared job with id: load-tests-go-flink-batch-combine-1-0905065310_0daae845-7a83-405d-bbb6-18de085cd3b7 and staging token: load-tests-go-flink-batch-combine-1-0905065310_0daae845-7a83-405d-bbb6-18de085cd3b7
2022/09/05 08:43:38 Staged binary artifact with token:
2022/09/05 08:43:40 Submitted job: load0tests0go0flink0batch0combine0100905065310-root-0905084338-72c3264f_ceddd84f-9cbe-408e-ad3a-433d54771f56
2022/09/05 08:43:40 Job state: STOPPED
2022/09/05 08:43:40 Job state: STARTING
2022/09/05 08:43:40 Job state: RUNNING
2022/09/05 08:44:49 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/05 08:44:49 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/05 08:44:49 Job state: FAILED
2022/09/05 08:44:49 Failed to execute job: job load0tests0go0flink0batch0combine0100905065310-root-0905084338-72c3264f_ceddd84f-9cbe-408e-ad3a-433d54771f56 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100905065310-root-0905084338-72c3264f_ceddd84f-9cbe-408e-ad3a-433d54771f56 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebd08, 0xc00004a0c0}, {0x14551cc?, 0x1f4c7b0?}, {0xc000241e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 41s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/aeoufhrswwxsi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #640
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/640/display/redirect?page=changes>
Changes:
[noreply] [#19857] Migrate to using a memory aware cache within the Python SDK
------------------------------------------
[...truncated 33.80 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/04 08:43:19 Using specified **** binary: 'linux_amd64/combine'
2022/09/04 08:43:19 Prepared job with id: load-tests-go-flink-batch-combine-1-0904065313_db12a9d2-dddb-4a15-9c09-e5fe814b9c07 and staging token: load-tests-go-flink-batch-combine-1-0904065313_db12a9d2-dddb-4a15-9c09-e5fe814b9c07
2022/09/04 08:43:23 Staged binary artifact with token:
2022/09/04 08:43:24 Submitted job: load0tests0go0flink0batch0combine0100904065313-root-0904084324-7115e068_4e1239d4-96f8-4e55-9804-e4bcb08638c9
2022/09/04 08:43:24 Job state: STOPPED
2022/09/04 08:43:24 Job state: STARTING
2022/09/04 08:43:24 Job state: RUNNING
2022/09/04 08:44:33 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/04 08:44:33 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
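The root cause stated above is that the Flink REST response for the job either omits `maxParallelism` or sends it as `null`, which Jackson's strict primitive handling (`FAIL_ON_NULL_FOR_PRIMITIVES`) refuses to map into a `long`. As a minimal diagnostic sketch (the function name and field list are hypothetical, not part of Beam or Flink), one can check a captured `JobDetailsInfo` JSON payload for exactly this condition:

```python
import json

def missing_primitive_fields(payload: str, required=("maxParallelism",)):
    """Return the required JobDetailsInfo fields that are absent or null.

    An absent or null value for any of these is what makes a strict
    deserializer (Jackson with FAIL_ON_NULL_FOR_PRIMITIVES) reject the
    REST response, as seen in the stack trace above.
    """
    doc = json.loads(payload)
    return [field for field in required if doc.get(field) is None]

# A payload like the one the client apparently received:
print(missing_primitive_fields('{"maxParallelism": null}'))  # ['maxParallelism']
# A well-formed payload passes:
print(missing_primitive_fields('{"maxParallelism": 128}'))   # []
```

A mismatch like this typically points at a version skew between the REST client bundled with the Beam Flink runner and the Flink cluster serving the endpoint, rather than at the load test itself.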
2022/09/04 08:44:33 Job state: FAILED
2022/09/04 08:44:34 Failed to execute job: job load0tests0go0flink0batch0combine0100904065313-root-0904084324-7115e068_4e1239d4-96f8-4e55-9804-e4bcb08638c9 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100904065313-root-0904084324-7115e068_4e1239d4-96f8-4e55-9804-e4bcb08638c9 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebd08, 0xc00012e000}, {0x14551cc?, 0x1f4c7b0?}, {0xc000611e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/7ntvgdi3jlspo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #639
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/639/display/redirect?page=changes>
Changes:
[noreply] [Tour Of Beam][backend] get unit content (#22967)
[noreply] Allows to use databaseio with postgres driver (#22941)
[noreply] Bump cloud.google.com/go/storage from 1.25.0 to 1.26.0 in /sdks (#22954)
[noreply] [BEAM-22859] Allow the specification of extra packages for external
[noreply] [Tour of Beam]: Welcome Screen frontend layout (#22794)
[noreply] Remove redundant testEventTimeTimerSetWithinAllowedLateness sickbay
[noreply] Adding support for Beam Schema Rows with BQ DIRECT_READ (#22926)
[noreply] Add java Bigquery IO known issue to beam 2.40 release blogpost (#22611)
[noreply] Update playground_deploy_examples.yml
[noreply] Add run-inference component for autolabeling (#22971)
[noreply] [Playground] [Infrastructure] Deleting the Cloud Storage Client (#22722)
[noreply] Updates Java RunInference to infer Python dependencies when possible
[noreply] Adding TensorFlow support to the Machine Learning overview page (#22949)
------------------------------------------
[...truncated 33.96 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/03 08:44:09 Using specified **** binary: 'linux_amd64/combine'
2022/09/03 08:44:10 Prepared job with id: load-tests-go-flink-batch-combine-1-0903065308_9fcd7396-97d5-47f6-a28c-488193cce8ea and staging token: load-tests-go-flink-batch-combine-1-0903065308_9fcd7396-97d5-47f6-a28c-488193cce8ea
2022/09/03 08:44:14 Staged binary artifact with token:
2022/09/03 08:44:16 Submitted job: load0tests0go0flink0batch0combine0100903065308-root-0903084414-2f0aca6c_e332cc66-9771-4ed0-8b15-344156d5068a
2022/09/03 08:44:16 Job state: STOPPED
2022/09/03 08:44:16 Job state: STARTING
2022/09/03 08:44:16 Job state: RUNNING
2022/09/03 08:45:25 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/03 08:45:25 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/03 08:45:25 Job state: FAILED
2022/09/03 08:45:25 Failed to execute job: job load0tests0go0flink0batch0combine0100903065308-root-0903084414-2f0aca6c_e332cc66-9771-4ed0-8b15-344156d5068a failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100903065308-root-0903084414-2f0aca6c_e332cc66-9771-4ed0-8b15-344156d5068a failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebd08, 0xc0001a6000}, {0x14551cc?, 0x1f4c7b0?}, {0xc0000fbe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ix2atm7zkicos
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #638
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/638/display/redirect?page=changes>
Changes:
[noreply] Add some explanatory comments to the wordcount registration (#22989)
[noreply] Move Go examples under the cookbook directory to generic registration
[noreply] Improve BQ test utils to support JSON in a more simple manner (#22942)
[noreply] [fixes #22980] Migrate BeamFnLoggingClient to the new execution state
[Robert Bradshaw] Update proto generation script due to BEAM-13939.
[Robert Bradshaw] Regenerate typescript protos.
[noreply] Add initial read_gbq wrapper (#22616)
[noreply] Minor: Fix lint failure (#22998)
------------------------------------------
[...truncated 33.74 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/02 08:43:34 Using specified **** binary: 'linux_amd64/combine'
2022/09/02 08:43:34 Prepared job with id: load-tests-go-flink-batch-combine-1-0902065316_6156d4e3-6c52-40e7-bb3f-963e92ce2850 and staging token: load-tests-go-flink-batch-combine-1-0902065316_6156d4e3-6c52-40e7-bb3f-963e92ce2850
2022/09/02 08:43:39 Staged binary artifact with token:
2022/09/02 08:43:40 Submitted job: load0tests0go0flink0batch0combine0100902065316-root-0902084339-4c90da0e_f375f530-0a21-4844-b53b-2247e7312f8c
2022/09/02 08:43:40 Job state: STOPPED
2022/09/02 08:43:40 Job state: STARTING
2022/09/02 08:43:40 Job state: RUNNING
2022/09/02 08:44:49 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/02 08:44:49 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/02 08:44:50 Job state: FAILED
2022/09/02 08:44:50 Failed to execute job: job load0tests0go0flink0batch0combine0100902065316-root-0902084339-4c90da0e_f375f530-0a21-4844-b53b-2247e7312f8c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100902065316-root-0902084339-4c90da0e_f375f530-0a21-4844-b53b-2247e7312f8c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebd68, 0xc00004a0c0}, {0x14550e9?, 0x1f4c7b0?}, {0xc0003bfe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/autbfvkdtyjkq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
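[Editorial note on the root cause above: the MismatchedInputException names Jackson's FAIL_ON_NULL_FOR_PRIMITIVES deserialization feature, which the Flink REST client's mapper has enabled, so a `null` "maxParallelism" in the JobDetailsInfo response is rejected instead of defaulting to 0. A minimal standalone sketch of that behavior follows; the `Details` class is a hypothetical stand-in for Flink's JobDetailsInfo, not the actual class.]

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullPrimitiveDemo {
    // Hypothetical stand-in for JobDetailsInfo: a primitive long field,
    // which cannot hold null.
    public static class Details {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // With the feature enabled (as in Flink's REST client), mapping
        // null into a primitive throws MismatchedInputException.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, Details.class);
            System.out.println("unexpected: strict mapper accepted null");
        } catch (MismatchedInputException e) {
            System.out.println("strict mapper rejects null for long");
        }

        // With the feature disabled (Jackson's own default), null is
        // coerced to the primitive default, 0L.
        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        Details d = lenient.readValue(json, Details.class);
        System.out.println("lenient maxParallelism = " + d.maxParallelism);
    }
}
```

[The sketch only illustrates the error message's own suggestion; whether the real fix belongs in the Flink REST client or in the JobManager response producing the null field is not decided here.]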
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #637
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/637/display/redirect?page=changes>
Changes:
[Brian Hulette] Extract utilities in dataframe.schemas
[Brian Hulette] Add pandas_type_compatibility with pandas BatchConverter implementations
[Brian Hulette] Use Batched DoFns at DataFrame API boundaries
[Brian Hulette] Move dtype conversion to pandas_type_compatibility
[Brian Hulette] Always register pandas BatchConverters
[Brian Hulette] Fix interactive runner tests
[Brian Hulette] Use pandas_type_compatibility BatchConverters for dataframe.schemas
[Brian Hulette] Skip test cases broken in pandas 1.1.x
[Brian Hulette] Address review comments
[Brian Hulette] yapf, typo in test
[noreply] Filter out unsupported state tests (#22963)
[noreply] Add ability to remove/clear map and set state (#22938)
[Brian Hulette] Add test to reproduce https://github.com/apache/beam/issues/22854
[Brian Hulette] Exercise row coder with nested optional struct
[Brian Hulette] Make RowTypeConstraint callable
[Brian Hulette] Add test to exercise RowTypeConstraint.__call__
[noreply] Fix gpu to cpu conversion with warning logs (#22795)
[noreply] Add Go stateful DoFns to CHANGES.md and fix linting violations (#22958)
[noreply] 22805: Upgrade Jackson version from 2.13.0 to 2.13.3 (#22806)
[noreply] Run cred rotation every month (#22977)
[noreply] [BEAM-12164] Synchronize access queue in ThroughputEstimator and
------------------------------------------
[...truncated 33.74 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/01 08:45:25 Using specified **** binary: 'linux_amd64/combine'
2022/09/01 08:45:25 Prepared job with id: load-tests-go-flink-batch-combine-1-0901065323_3de62d79-42d4-47d3-b3bf-155da7b4c002 and staging token: load-tests-go-flink-batch-combine-1-0901065323_3de62d79-42d4-47d3-b3bf-155da7b4c002
2022/09/01 08:45:29 Staged binary artifact with token:
2022/09/01 08:45:30 Submitted job: load0tests0go0flink0batch0combine0100901065323-root-0901084529-bb694993_2ce06e73-56b8-4d55-816d-c26c230fa456
2022/09/01 08:45:30 Job state: STOPPED
2022/09/01 08:45:30 Job state: STARTING
2022/09/01 08:45:30 Job state: RUNNING
2022/09/01 08:46:39 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/09/01 08:46:39 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/01 08:46:39 Job state: FAILED
2022/09/01 08:46:39 Failed to execute job: job load0tests0go0flink0batch0combine0100901065323-root-0901084529-bb694993_2ce06e73-56b8-4d55-816d-c26c230fa456 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100901065323-root-0901084529-bb694993_2ce06e73-56b8-4d55-816d-c26c230fa456 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebd68, 0xc0001a8000}, {0x14550e9?, 0x1f4c7b0?}, {0xc000673e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 58s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/pqmj24ywohea2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #636
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/636/display/redirect?page=changes>
Changes:
[noreply] Fix yaml duplicated mapping key (#22952)
[noreply] [Playground] [Infrastructure] Adding the Cloud Datastore client to save
[noreply] Fix jdbc date conversion offset 1 day (#22738)
[noreply] Set state integration test (#22935)
[noreply] Minor: Fix option_from_runner_api typehint (#22946)
------------------------------------------
[...truncated 33.81 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/31 08:44:02 Using specified **** binary: 'linux_amd64/combine'
2022/08/31 08:44:02 Prepared job with id: load-tests-go-flink-batch-combine-1-0831065310_b7fa85ac-323d-40ee-8dd6-37f149c39cef and staging token: load-tests-go-flink-batch-combine-1-0831065310_b7fa85ac-323d-40ee-8dd6-37f149c39cef
2022/08/31 08:44:06 Staged binary artifact with token:
2022/08/31 08:44:08 Submitted job: load0tests0go0flink0batch0combine0100831065310-root-0831084407-43ba7b36_83408807-a193-45da-8258-61c3778b17fa
2022/08/31 08:44:08 Job state: STOPPED
2022/08/31 08:44:08 Job state: STARTING
2022/08/31 08:44:08 Job state: RUNNING
2022/08/31 08:45:16 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/31 08:45:16 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/31 08:45:17 Job state: FAILED
2022/08/31 08:45:17 Failed to execute job: job load0tests0go0flink0batch0combine0100831065310-root-0831084407-43ba7b36_83408807-a193-45da-8258-61c3778b17fa failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100831065310-root-0831084407-43ba7b36_83408807-a193-45da-8258-61c3778b17fa failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebce8, 0xc0001a6000}, {0x1455069?, 0x1f4c7b0?}, {0xc00071be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 51s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/e4ovmiqnjwojy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
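[Editor's note, not part of the original log] The root cause in the trace above is Jackson's MismatchedInputException: the Flink REST response carries `"maxParallelism": null`, which cannot be mapped into a primitive `long` while `FAIL_ON_NULL_FOR_PRIMITIVES` is enabled. The sketch below (assuming jackson-databind on the classpath; `Details` is a hypothetical stand-in for `JobDetailsInfo`) reproduces both behaviors the error message describes:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class NullPrimitiveDemo {
    // Hypothetical DTO standing in for JobDetailsInfo: a primitive long
    // field cannot hold null, so a JSON null is rejected when
    // FAIL_ON_NULL_FOR_PRIMITIVES is enabled.
    public static class Details {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // Strict mapper (the feature is opt-in; Flink's REST client enables
        // it): deserializing null into `long` throws.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, Details.class);
            System.out.println("strict: accepted");
        } catch (Exception e) {
            System.out.println("strict: rejected null for primitive long");
        }

        // Lenient mapper, as the error message suggests: null coerces to 0L.
        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        Details d = lenient.readValue(json, Details.class);
        System.out.println("lenient maxParallelism = " + d.maxParallelism);
    }
}
```

This illustrates why the failure is on the client side: the JobManager reported a job without `maxParallelism` set, and the shaded Jackson mapper in the REST client refuses the null rather than defaulting it.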
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #635
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/635/display/redirect?page=changes>
Changes:
[yathu] Support Timestamp type in xlang JDBC Read and Write
[yathu] change urn name to millis_instant:v1
[yathu] Add standard_coders test
[yathu] Apply suggestions from code review
[yathu] Fix Java standard coder test
[yathu] Fix logical type with same language type gets completely hidden
[Robert Bradshaw] [BEAM-22923] Allow sharding specification for dataframe writes.
[noreply] Add set state in Go (#22919)
[noreply] Go Map State integration test (#22898)
[noreply] Add clear function for bag state types (#22917)
[noreply] [Playground] Update build_playground_backend.yml - add "Index creation"
[noreply] [Playground] [Backend] added SDK validation to save a code snippet
[noreply] Fix linting violations (#22934)
[noreply] [akvelon][tour-of-beam] backend bootstraps (#22556)
[noreply] Bump up postcommit timeout (#22937)
[noreply] Handle stateful windows correctly + integration test (#22918)
[noreply] Automatically infer state keys from their field name (#22922)
[noreply] Updates to multi-lang Java quickstart (#22927)
------------------------------------------
[...truncated 33.71 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/30 13:38:19 Using specified **** binary: 'linux_amd64/combine'
2022/08/30 13:38:20 Prepared job with id: load-tests-go-flink-batch-combine-1-0830112345_c6e370a2-7020-4519-9ea4-987dbc09f02e and staging token: load-tests-go-flink-batch-combine-1-0830112345_c6e370a2-7020-4519-9ea4-987dbc09f02e
2022/08/30 13:38:24 Staged binary artifact with token:
2022/08/30 13:38:25 Submitted job: load0tests0go0flink0batch0combine0100830112345-root-0830133824-4364e9d_02e494ab-5028-4a3f-8cc4-b3278a4491af
2022/08/30 13:38:25 Job state: STOPPED
2022/08/30 13:38:25 Job state: STARTING
2022/08/30 13:38:25 Job state: RUNNING
2022/08/30 13:39:34 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/30 13:39:34 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/30 13:39:34 Job state: FAILED
2022/08/30 13:39:34 Failed to execute job: job load0tests0go0flink0batch0combine0100830112345-root-0830133824-4364e9d_02e494ab-5028-4a3f-8cc4-b3278a4491af failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100830112345-root-0830133824-4364e9d_02e494ab-5028-4a3f-8cc4-b3278a4491af failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ebce8, 0xc00012e000}, {0x1455069?, 0x1f4c7b0?}, {0xc000297e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/337anad4yhuui
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #634
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/634/display/redirect>
Changes:
------------------------------------------
[...truncated 33.78 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/29 08:43:15 Using specified **** binary: 'linux_amd64/combine'
2022/08/29 08:43:15 Prepared job with id: load-tests-go-flink-batch-combine-1-0829065322_69554a1f-7290-4bef-be53-2a565db373cb and staging token: load-tests-go-flink-batch-combine-1-0829065322_69554a1f-7290-4bef-be53-2a565db373cb
2022/08/29 08:43:20 Staged binary artifact with token:
2022/08/29 08:43:20 Submitted job: load0tests0go0flink0batch0combine0100829065322-root-0829084320-d1151b0a_940e04d8-b429-4204-9d27-d74f82a836ed
2022/08/29 08:43:21 Job state: STOPPED
2022/08/29 08:43:21 Job state: STARTING
2022/08/29 08:43:21 Job state: RUNNING
2022/08/29 08:44:29 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/29 08:44:29 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/29 08:44:29 Job state: FAILED
2022/08/29 08:44:29 Failed to execute job: job load0tests0go0flink0batch0combine0100829065322-root-0829084320-d1151b0a_940e04d8-b429-4204-9d27-d74f82a836ed failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100829065322-root-0829084320-d1151b0a_940e04d8-b429-4204-9d27-d74f82a836ed failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15eaa68, 0xc00004a0c0}, {0x1454006?, 0x1f4a7b0?}, {0xc000237e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 30s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/kxenmpcl3jeyy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #633
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/633/display/redirect>
Changes:
------------------------------------------
[...truncated 33.80 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/28 08:43:15 Using specified **** binary: 'linux_amd64/combine'
2022/08/28 08:43:15 Prepared job with id: load-tests-go-flink-batch-combine-1-0828065323_87be30d0-7b23-436a-b5c9-1a3c7c1ab559 and staging token: load-tests-go-flink-batch-combine-1-0828065323_87be30d0-7b23-436a-b5c9-1a3c7c1ab559
2022/08/28 08:43:19 Staged binary artifact with token:
2022/08/28 08:43:20 Submitted job: load0tests0go0flink0batch0combine0100828065323-root-0828084320-d5c2d411_71e5c239-53e6-49e2-a848-0bb54a451c30
2022/08/28 08:43:20 Job state: STOPPED
2022/08/28 08:43:20 Job state: STARTING
2022/08/28 08:43:20 Job state: RUNNING
2022/08/28 08:44:29 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/28 08:44:29 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/28 08:44:30 Job state: FAILED
2022/08/28 08:44:30 Failed to execute job: job load0tests0go0flink0batch0combine0100828065323-root-0828084320-d5c2d411_71e5c239-53e6-49e2-a848-0bb54a451c30 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100828065323-root-0828084320-d5c2d411_71e5c239-53e6-49e2-a848-0bb54a451c30 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15eaa68, 0xc00012e000}, {0x1454006?, 0x1f4a7b0?}, {0xc000307e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 31s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/slu2dypr4pke6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #632
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/632/display/redirect?page=changes>
Changes:
[noreply] Pass user specified destination type to UpdateSchemaDestination
[noreply] [Go SDK] Stream decode values in single iterations (#22904)
[noreply] Enable autosharding for BQ: #22818
[noreply] Update wordcount_minimal.py by removing pipeline_args.extend (#22786)
[noreply] Add map state in the Go Sdk (#22897)
[noreply] [BEAM-12164] Feat: Added support to Cloud Spanner Change Streams
------------------------------------------
[...truncated 33.76 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/27 08:43:30 Using specified **** binary: 'linux_amd64/combine'
2022/08/27 08:43:30 Prepared job with id: load-tests-go-flink-batch-combine-1-0827065309_db91c839-d1b6-427e-bdae-17c4a386e8e3 and staging token: load-tests-go-flink-batch-combine-1-0827065309_db91c839-d1b6-427e-bdae-17c4a386e8e3
2022/08/27 08:43:35 Staged binary artifact with token:
2022/08/27 08:43:36 Submitted job: load0tests0go0flink0batch0combine0100827065309-root-0827084335-1470c23f_c7ef896c-a26d-4a59-8eec-115cf7fc9829
2022/08/27 08:43:36 Job state: STOPPED
2022/08/27 08:43:36 Job state: STARTING
2022/08/27 08:43:36 Job state: RUNNING
2022/08/27 08:44:45 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/27 08:44:45 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/27 08:44:46 Job state: FAILED
2022/08/27 08:44:46 Failed to execute job: job load0tests0go0flink0batch0combine0100827065309-root-0827084335-1470c23f_c7ef896c-a26d-4a59-8eec-115cf7fc9829 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100827065309-root-0827084335-1470c23f_c7ef896c-a26d-4a59-8eec-115cf7fc9829 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15eaa68, 0xc000136000}, {0x1454006?, 0x1f4a7b0?}, {0xc00029be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 47s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/q3ouoggeova5k
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #631
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/631/display/redirect?page=changes>
Changes:
[Robert Bradshaw] [BEAM-22723] Yield BatchElement batches at end of window.
[noreply] Update sdks/python/apache_beam/transforms/util_test.py
[noreply] [Website] add Python to KinesisIO in connectors #22845 (#22841)
[noreply] Combining state integration test (#22846)
[cushon] Update to Byte Buddy 1.12.14
[cushon] Add a regression test
[cushon] Add spotless exclusion
[noreply] Small lint fixes (#22890)
[noreply] Preserve state on SDK switch (#22430) (#22735)
------------------------------------------
[...truncated 33.62 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/26 08:43:18 Using specified **** binary: 'linux_amd64/combine'
2022/08/26 08:43:18 Prepared job with id: load-tests-go-flink-batch-combine-1-0826065344_c971f7ce-5f05-4898-9aa7-2cf8311404dc and staging token: load-tests-go-flink-batch-combine-1-0826065344_c971f7ce-5f05-4898-9aa7-2cf8311404dc
2022/08/26 08:43:22 Staged binary artifact with token:
2022/08/26 08:43:23 Submitted job: load0tests0go0flink0batch0combine0100826065344-root-0826084322-81fefdc7_5a53caa0-1bb7-4cf4-ad33-51644e8d8e48
2022/08/26 08:43:23 Job state: STOPPED
2022/08/26 08:43:23 Job state: STARTING
2022/08/26 08:43:23 Job state: RUNNING
2022/08/26 08:44:32 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/26 08:44:32 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/26 08:44:33 Job state: FAILED
2022/08/26 08:44:33 Failed to execute job: job load0tests0go0flink0batch0combine0100826065344-root-0826084322-81fefdc7_5a53caa0-1bb7-4cf4-ad33-51644e8d8e48 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100826065344-root-0826084322-81fefdc7_5a53caa0-1bb7-4cf4-ad33-51644e8d8e48 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15e44e8, 0xc00012e000}, {0x144e05a?, 0x1f427b0?}, {0xc00040be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 33s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/itq372gfvqjtm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #630
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/630/display/redirect?page=changes>
Changes:
[chamikaramj] Updates old releases to use archive.apache.org
[noreply] added link to setup instructions in WordCount example (#22832)
[noreply] Bump google.golang.org/api from 0.93.0 to 0.94.0 in /sdks (#22839)
[noreply] Bump cloud.google.com/go/bigquery from 1.38.0 to 1.39.0 in /sdks
[noreply] Add an integration test for bag state (#22827)
[noreply] Fix a few linting issues (#22842)
[noreply] Add combining state support (#22826)
[noreply] Bump cloud.google.com/go/pubsub from 1.24.0 to 1.25.1 in /sdks (#22850)
[noreply] Bump google.golang.org/grpc from 1.48.0 to 1.49.0 in /sdks (#22838)
[noreply] [Website] update videos section (#22772)
[noreply] Update Dataflow fnapi_container-version (#22852)
[noreply] Go SDK Katas: Update beam module dependency (#22753)
[noreply] unskip sklearn IT test (#22825)
------------------------------------------
[...truncated 33.69 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/25 08:43:33 Using specified **** binary: 'linux_amd64/combine'
2022/08/25 08:43:34 Prepared job with id: load-tests-go-flink-batch-combine-1-0825065325_d2be02cd-ea0e-4854-ba64-c3e835b66729 and staging token: load-tests-go-flink-batch-combine-1-0825065325_d2be02cd-ea0e-4854-ba64-c3e835b66729
2022/08/25 08:43:38 Staged binary artifact with token:
2022/08/25 08:43:39 Submitted job: load0tests0go0flink0batch0combine0100825065325-root-0825084338-9407e1_0b5a9daa-e906-4773-94a1-72e6154e4d56
2022/08/25 08:43:39 Job state: STOPPED
2022/08/25 08:43:39 Job state: STARTING
2022/08/25 08:43:39 Job state: RUNNING
2022/08/25 08:44:48 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/25 08:44:48 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/25 08:44:48 Job state: FAILED
2022/08/25 08:44:48 Failed to execute job: job load0tests0go0flink0batch0combine0100825065325-root-0825084338-9407e1_0b5a9daa-e906-4773-94a1-72e6154e4d56 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100825065325-root-0825084338-9407e1_0b5a9daa-e906-4773-94a1-72e6154e4d56 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15e44e8, 0xc00004a0c0}, {0x144e05a?, 0x1f427b0?}, {0xc000161e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/lnechirkgchk2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #629
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/629/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Eliminate some null errors and rawtypes from sdks/java/core
[Kiley Sok] Update Beam 2.41.0 release docs
[noreply] [Playground] Setup Datastore in Playground project using Terraform -
[noreply] Add bag state support (#22816)
[Kiley Sok] Fix dates for 2.41.0 release
------------------------------------------
[...truncated 33.79 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/24 08:43:39 Using specified worker binary: 'linux_amd64/combine'
2022/08/24 08:43:39 Prepared job with id: load-tests-go-flink-batch-combine-1-0824065307_06df86ec-1133-4cce-8d1a-d6295c8f4216 and staging token: load-tests-go-flink-batch-combine-1-0824065307_06df86ec-1133-4cce-8d1a-d6295c8f4216
2022/08/24 08:43:44 Staged binary artifact with token:
2022/08/24 08:43:45 Submitted job: load0tests0go0flink0batch0combine0100824065307-root-0824084344-356e1691_7be1a3a3-016c-4ee3-84fc-ae0ba2b3e670
2022/08/24 08:43:45 Job state: STOPPED
2022/08/24 08:43:45 Job state: STARTING
2022/08/24 08:43:45 Job state: RUNNING
2022/08/24 08:44:53 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/24 08:44:53 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/24 08:44:53 Job state: FAILED
2022/08/24 08:44:53 Failed to execute job: job load0tests0go0flink0batch0combine0100824065307-root-0824084344-356e1691_7be1a3a3-016c-4ee3-84fc-ae0ba2b3e670 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100824065307-root-0824084344-356e1691_7be1a3a3-016c-4ee3-84fc-ae0ba2b3e670 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15df648, 0xc000136000}, {0x1449f3b?, 0x1f3b7b8?}, {0xc00072fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 45s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/tzcdoqrf6gbdi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
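The root cause in the trace above is Jackson refusing to map a `null` JSON value into the primitive `long` field `maxParallelism` of Flink's `JobDetailsInfo` (its `FAIL_ON_NULL_FOR_PRIMITIVES` behavior). A minimal Go sketch of the same failure mode — the type and function names here are illustrative, not Beam or Flink code — uses a pointer field so a missing or null value can be detected, mirroring what the strict Jackson mapping rejects:

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// jobDetails mirrors only the field the log complains about. A pointer
// lets us distinguish "present" from "missing or null", which a
// primitive `long` mapping cannot represent.
type jobDetails struct {
	MaxParallelism *int64 `json:"maxParallelism"`
}

// parseJobDetails emulates Jackson's FAIL_ON_NULL_FOR_PRIMITIVES:
// a null or absent maxParallelism is an error, not a silent zero.
func parseJobDetails(payload []byte) (int64, error) {
	var d jobDetails
	if err := json.Unmarshal(payload, &d); err != nil {
		return 0, err
	}
	if d.MaxParallelism == nil {
		return 0, errors.New("cannot map null into type long: maxParallelism")
	}
	return *d.MaxParallelism, nil
}

func main() {
	// A REST response lacking maxParallelism reproduces the failure
	// mode seen in the log above.
	if _, err := parseJobDetails([]byte(`{"jid":"abc"}`)); err != nil {
		fmt.Println("error:", err)
	}
	if v, err := parseJobDetails([]byte(`{"maxParallelism":128}`)); err == nil {
		fmt.Println("maxParallelism:", v)
	}
}
```

A response missing this field usually suggests the REST client and the JobManager disagree on the `JobDetailsInfo` schema, e.g. a Flink version mismatch between the submitting client and the cluster.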
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #628
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/628/display/redirect?page=changes>
Changes:
[yathu] Evaluate proper metric in TextIOIT
[Andrew Pilloud] Add Python nexmark to gradle
[Michael Luckey] Align neo4j error messages with API
[noreply] Add Release category to release announcement blogs (#22785)
[noreply] [BEAM-13657] Update Python version used by mypy. (#22804)
[noreply] E2E basic state support (#22798)
[noreply] Add state integration test (#22815)
------------------------------------------
[...truncated 33.87 KB...]
2022/08/23 08:43:57 Using specified worker binary: 'linux_amd64/combine'
2022/08/23 08:43:58 Prepared job with id: load-tests-go-flink-batch-combine-1-0823065314_727be7f7-2b1c-4d83-9cd6-c50f9d47d9f2 and staging token: load-tests-go-flink-batch-combine-1-0823065314_727be7f7-2b1c-4d83-9cd6-c50f9d47d9f2
2022/08/23 08:44:02 Staged binary artifact with token:
2022/08/23 08:44:03 Submitted job: load0tests0go0flink0batch0combine0100823065314-root-0823084402-283b41d1_9af37143-a21a-4e38-8304-4bc48699dc0b
2022/08/23 08:44:03 Job state: STOPPED
2022/08/23 08:44:03 Job state: STARTING
2022/08/23 08:44:03 Job state: RUNNING
2022/08/23 08:45:12 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
2022/08/23 08:45:12 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/23 08:45:12 Job state: FAILED
2022/08/23 08:45:12 Failed to execute job: job load0tests0go0flink0batch0combine0100823065314-root-0823084402-283b41d1_9af37143-a21a-4e38-8304-4bc48699dc0b failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100823065314-root-0823084402-283b41d1_9af37143-a21a-4e38-8304-4bc48699dc0b failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15de188, 0xc00004a0c0}, {0x1448c5b?, 0x1f397b8?}, {0xc0004ffe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/33prllb7mqdwc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #627
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/627/display/redirect?page=changes>
Changes:
[noreply] Bump cloud.google.com/go/bigquery from 1.37.0 to 1.38.0 in /sdks
------------------------------------------
[...truncated 33.76 KB...]
2022/08/22 08:43:22 Using specified **** binary: 'linux_amd64/combine'
2022/08/22 08:43:23 Prepared job with id: load-tests-go-flink-batch-combine-1-0822065316_094907fa-4054-4985-b97b-926cc641e03b and staging token: load-tests-go-flink-batch-combine-1-0822065316_094907fa-4054-4985-b97b-926cc641e03b
2022/08/22 08:43:27 Staged binary artifact with token:
2022/08/22 08:43:28 Submitted job: load0tests0go0flink0batch0combine0100822065316-root-0822084327-3ddc7c9f_223129b8-335c-415c-abdd-3337be538934
2022/08/22 08:43:28 Job state: STOPPED
2022/08/22 08:43:28 Job state: STARTING
2022/08/22 08:43:28 Job state: RUNNING
2022/08/22 08:44:36 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/22 08:44:36 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/22 08:44:37 Job state: FAILED
2022/08/22 08:44:37 Failed to execute job: job load0tests0go0flink0batch0combine0100822065316-root-0822084327-3ddc7c9f_223129b8-335c-415c-abdd-3337be538934 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100822065316-root-0822084327-3ddc7c9f_223129b8-335c-415c-abdd-3337be538934 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15da028, 0xc00012e000}, {0x1444bc6?, 0x1f347b8?}, {0xc000367e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/yqayjfzvdkexw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #626
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/626/display/redirect>
Changes:
------------------------------------------
[...truncated 33.73 KB...]
2022/08/21 08:43:23 Using specified **** binary: 'linux_amd64/combine'
2022/08/21 08:43:24 Prepared job with id: load-tests-go-flink-batch-combine-1-0821065311_aa26fb3b-9dc0-40e4-9af1-ce6b98f24f82 and staging token: load-tests-go-flink-batch-combine-1-0821065311_aa26fb3b-9dc0-40e4-9af1-ce6b98f24f82
2022/08/21 08:43:28 Staged binary artifact with token:
2022/08/21 08:43:29 Submitted job: load0tests0go0flink0batch0combine0100821065311-root-0821084328-95cd527_99abd165-d830-4b89-83bf-414c9d3e2242
2022/08/21 08:43:29 Job state: STOPPED
2022/08/21 08:43:29 Job state: STARTING
2022/08/21 08:43:29 Job state: RUNNING
2022/08/21 08:44:38 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
2022/08/21 08:44:38 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/21 08:44:38 Job state: FAILED
2022/08/21 08:44:38 Failed to execute job: job load0tests0go0flink0batch0combine0100821065311-root-0821084328-95cd527_99abd165-d830-4b89-83bf-414c9d3e2242 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100821065311-root-0821084328-95cd527_99abd165-d830-4b89-83bf-414c9d3e2242 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d8788, 0xc00004a0c0}, {0x14434e5?, 0x1f314d0?}, {0xc00029de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ytw7nwt32kzny
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #625
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/625/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] add scroll to new position if anchor is present #22699
[randomstep] [BEAM-8701] bump commons-io to 2.7
[bulat.safiullin] [Website] remove text from Available contact channels table #22696
[bulat.safiullin] [Website] update commits link #22520
[noreply] [Go SDK] Fix go lint errors (#22796)
[noreply] Modify RunInference to return PipelineResult for the benchmark tests
[noreply] Fix lint issues (#22800)
------------------------------------------
[...truncated 33.63 KB...]
2022/08/20 08:43:16 Using specified **** binary: 'linux_amd64/combine'
2022/08/20 08:43:17 Prepared job with id: load-tests-go-flink-batch-combine-1-0820065311_b57e0c58-3748-4838-b01c-66c822d9a051 and staging token: load-tests-go-flink-batch-combine-1-0820065311_b57e0c58-3748-4838-b01c-66c822d9a051
2022/08/20 08:43:21 Staged binary artifact with token:
2022/08/20 08:43:22 Submitted job: load0tests0go0flink0batch0combine0100820065311-root-0820084321-621474db_6ed83271-fb21-470c-9148-257a2cba08fb
2022/08/20 08:43:22 Job state: STOPPED
2022/08/20 08:43:22 Job state: STARTING
2022/08/20 08:43:22 Job state: RUNNING
2022/08/20 08:44:31 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/20 08:44:31 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/20 08:44:31 Job state: FAILED
2022/08/20 08:44:31 Failed to execute job: job load0tests0go0flink0batch0combine0100820065311-root-0820084321-621474db_6ed83271-fb21-470c-9148-257a2cba08fb failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100820065311-root-0820084321-621474db_6ed83271-fb21-470c-9148-257a2cba08fb failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d8788, 0xc00012e000}, {0x14434e5?, 0x1f314d0?}, {0xc000625e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/5yboi5fyfzdhw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
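[Editor's note] The root cause repeated in the build reports above is Jackson's strict primitive handling: the Flink REST endpoint returned a JobDetailsInfo payload whose maxParallelism field was null (typically a Flink client/server version mismatch), and the REST client refuses to map `null` into a primitive `long` unless FAIL_ON_NULL_FOR_PRIMITIVES is disabled. A minimal, language-agnostic sketch of that failure mode follows — the parser function, payload contents, and the default value of 0 mirror Jackson's documented behavior, but the code itself is hypothetical, not Flink's implementation:

```python
import json

def parse_job_details(payload: str, fail_on_null_for_primitives: bool = True) -> dict:
    """Mimic a strict deserializer: a JSON null (or missing field) cannot be
    mapped into a primitive (non-nullable) long field unless the strict
    setting is turned off, in which case the primitive default (0) is used."""
    data = json.loads(payload)
    value = data.get("maxParallelism")
    if value is None:
        if fail_on_null_for_primitives:
            # Mirrors: MismatchedInputException: Cannot map `null` into type `long`
            raise ValueError(
                'Cannot map `null` into type `long` '
                '(through reference chain: JobDetailsInfo["maxParallelism"])'
            )
        value = 0  # Jackson substitutes the primitive default when lenient
    return {"maxParallelism": value}

# A server that sends null for the field makes the strict client fail...
try:
    parse_job_details('{"jid": "abc", "name": "combine", "maxParallelism": null}')
except ValueError as e:
    print("strict:", e)

# ...while the lenient setting substitutes the primitive default instead.
print(parse_job_details('{"maxParallelism": null}', fail_on_null_for_primitives=False))
```

In the builds above the fix is not to flip the Jackson flag but to align the Flink client and cluster versions so the REST response actually carries the field.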
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #624
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/624/display/redirect?page=changes>
Changes:
[cushon] Downgrade bytebuddy version to 1.11.0
[noreply] Label kata changes with the language they're modifying (#22764)
[noreply] [Website] Add GitHub issue link (#22774)
[noreply] Fix some typos in the ML doc (#22763)
[noreply] Go stateful DoFns user side changes (#22761)
[noreply] fixed column width in tables in Getting started from Spark guide
[noreply] Testing authentication for Playground (#22782)
[noreply] [BEAM-12776, fixes #21095] Limit parallel closes from the prior element
[noreply] [BEAM-13015, #21250] Reuse buffers when possible when writing on
------------------------------------------
[...truncated 33.71 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/19 08:43:23 Using specified **** binary: 'linux_amd64/combine'
2022/08/19 08:43:23 Prepared job with id: load-tests-go-flink-batch-combine-1-0819065306_05d346c9-75b2-4fd9-bc21-c6ae34268943 and staging token: load-tests-go-flink-batch-combine-1-0819065306_05d346c9-75b2-4fd9-bc21-c6ae34268943
2022/08/19 08:43:28 Staged binary artifact with token:
2022/08/19 08:43:29 Submitted job: load0tests0go0flink0batch0combine0100819065306-root-0819084328-5f2994cb_2efeadbf-3242-40ca-bc66-ce37cda0b794
2022/08/19 08:43:29 Job state: STOPPED
2022/08/19 08:43:29 Job state: STARTING
2022/08/19 08:43:29 Job state: RUNNING
2022/08/19 08:44:37 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/19 08:44:37 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/19 08:44:38 Job state: FAILED
2022/08/19 08:44:38 Failed to execute job: job load0tests0go0flink0batch0combine0100819065306-root-0819084328-5f2994cb_2efeadbf-3242-40ca-bc66-ce37cda0b794 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100819065306-root-0819084328-5f2994cb_2efeadbf-3242-40ca-bc66-ce37cda0b794 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d8788, 0xc00004a0c0}, {0x14434e5?, 0x1f314d0?}, {0xc00026be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/unzibgjliikca
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #623
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/623/display/redirect?page=changes>
Changes:
[noreply] Bump google.golang.org/api from 0.92.0 to 0.93.0 in /sdks (#22752)
[noreply] Fix direct running mode multi_processing on win32 (#22730)
[noreply] Improve error message on schema issues (#22469)
[noreply] sklearn runinference regression example (#22088)
[noreply] [Website] add intuit case-study, add intuit quote-card (#22757)
[noreply] Avoid panic on type assert. (#22767)
[noreply] [#21935] Reject ill formed GroupByKey coders during pipeline.run
[noreply] Don't use batch interface for single object operations (#22432)
------------------------------------------
[...truncated 33.69 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/18 08:43:27 Using specified **** binary: 'linux_amd64/combine'
2022/08/18 08:43:27 Prepared job with id: load-tests-go-flink-batch-combine-1-0818065313_f369f790-7eb6-43e0-b8ac-6186029d2729 and staging token: load-tests-go-flink-batch-combine-1-0818065313_f369f790-7eb6-43e0-b8ac-6186029d2729
2022/08/18 08:43:31 Staged binary artifact with token:
2022/08/18 08:43:32 Submitted job: load0tests0go0flink0batch0combine0100818065313-root-0818084332-e48e3b0c_f73ad89a-c270-454a-97ac-2692416ba030
2022/08/18 08:43:32 Job state: STOPPED
2022/08/18 08:43:32 Job state: STARTING
2022/08/18 08:43:32 Job state: RUNNING
2022/08/18 08:44:41 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/18 08:44:41 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/18 08:44:42 Job state: FAILED
2022/08/18 08:44:42 Failed to execute job: job load0tests0go0flink0batch0combine0100818065313-root-0818084332-e48e3b0c_f73ad89a-c270-454a-97ac-2692416ba030 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100818065313-root-0818084332-e48e3b0c_f73ad89a-c270-454a-97ac-2692416ba030 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d5188, 0xc00004a0c0}, {0x1440b3f?, 0x1f2d470?}, {0xc000463e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ccomas5tdierm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #622
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/622/display/redirect?page=changes>
Changes:
[noreply] Handle single-precision float values in the standard coders tests
[noreply] [BEAM-13015, #21250] Remove looking up thread local metrics container
[noreply] [fixes #22731] Publish nightly snapshot of legacy Dataflow worker jar.
[andyye333] Remove assert
[noreply] [fixes #22744] Update hadoop library patch versions to 2.10.2 and 3.2.4
[noreply] Update beam-master version for legacy (#22741)
------------------------------------------
[...truncated 33.84 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/17 08:43:39 Using specified **** binary: 'linux_amd64/combine'
2022/08/17 08:43:40 Prepared job with id: load-tests-go-flink-batch-combine-1-0817065313_db3d2246-9aaa-4b09-b4f8-a2ad599543a1 and staging token: load-tests-go-flink-batch-combine-1-0817065313_db3d2246-9aaa-4b09-b4f8-a2ad599543a1
2022/08/17 08:43:44 Staged binary artifact with token:
2022/08/17 08:43:45 Submitted job: load0tests0go0flink0batch0combine0100817065313-root-0817084344-4f97481e_5b56c5c8-9cec-4925-b41d-2e160dbece6c
2022/08/17 08:43:45 Job state: STOPPED
2022/08/17 08:43:45 Job state: STARTING
2022/08/17 08:43:45 Job state: RUNNING
2022/08/17 08:44:54 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/17 08:44:54 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/17 08:44:54 Job state: FAILED
2022/08/17 08:44:54 Failed to execute job: job load0tests0go0flink0batch0combine0100817065313-root-0817084344-4f97481e_5b56c5c8-9cec-4925-b41d-2e160dbece6c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100817065313-root-0817084344-4f97481e_5b56c5c8-9cec-4925-b41d-2e160dbece6c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d6188, 0xc00004a0c0}, {0x1441b3f?, 0x1f2e470?}, {0xc0004cde70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/e6hnxr267wvoc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #621
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/621/display/redirect?page=changes>
Changes:
[Steve Niemitz] Fix UpdateSchemaDestination when source format is set to AVRO
[noreply] Attempt to fix SpannerIO test flakes (#22688)
[noreply] Add a dataflow override for runnerv1 to still use SDF on runnerv2.
[noreply] [Playground] Result filter bug (#22215)
[noreply] [Website] update case-studies layout (#22342)
[noreply] Implement KafkaSchemaTransformReadConfiguration (#22403)
------------------------------------------
[...truncated 33.89 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/16 08:43:58 Using specified **** binary: 'linux_amd64/combine'
2022/08/16 08:43:58 Prepared job with id: load-tests-go-flink-batch-combine-1-0816065314_a78715a3-cad6-4220-bb24-d5ae6594503c and staging token: load-tests-go-flink-batch-combine-1-0816065314_a78715a3-cad6-4220-bb24-d5ae6594503c
2022/08/16 08:44:03 Staged binary artifact with token:
2022/08/16 08:44:04 Submitted job: load0tests0go0flink0batch0combine0100816065314-root-0816084403-2f6415d5_8f984b4e-9a25-4d96-bd5c-a7856318bf8b
2022/08/16 08:44:04 Job state: STOPPED
2022/08/16 08:44:04 Job state: STARTING
2022/08/16 08:44:04 Job state: RUNNING
2022/08/16 08:45:13 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/16 08:45:13 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/16 08:45:14 Job state: FAILED
2022/08/16 08:45:14 Failed to execute job: job load0tests0go0flink0batch0combine0100816065314-root-0816084403-2f6415d5_8f984b4e-9a25-4d96-bd5c-a7856318bf8b failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100816065314-root-0816084403-2f6415d5_8f984b4e-9a25-4d96-bd5c-a7856318bf8b failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d6188, 0xc00012e000}, {0x1441b3f?, 0x1f2e470?}, {0xc00056be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wcj7hj5gwlpvk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
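The root cause in the trace above is Jackson's FAIL_ON_NULL_FOR_PRIMITIVES behavior: the Flink REST client received a JobDetailsInfo response in which "maxParallelism" was JSON null, and null cannot be mapped into a primitive `long`. The Go sketch below reproduces that failure mode in miniature; the `jobDetails` struct and `strictUnmarshal` function are illustrative names only, not Beam or Flink code, and the struct covers just the one field relevant here.

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// jobDetails mirrors only the field of Flink's JobDetailsInfo response
// that the error above complains about; it is not the full REST schema.
type jobDetails struct {
	MaxParallelism int64 `json:"maxParallelism"`
}

// strictUnmarshal rejects a JSON null for the primitive field, mimicking
// Jackson's FAIL_ON_NULL_FOR_PRIMITIVES check that produced the
// MismatchedInputException in the log.
func strictUnmarshal(data []byte) (*jobDetails, error) {
	var raw map[string]json.RawMessage
	if err := json.Unmarshal(data, &raw); err != nil {
		return nil, err
	}
	if v, ok := raw["maxParallelism"]; !ok || string(v) == "null" {
		return nil, errors.New(`cannot map null into type long ("maxParallelism")`)
	}
	var jd jobDetails
	if err := json.Unmarshal(data, &jd); err != nil {
		return nil, err
	}
	return &jd, nil
}

func main() {
	// A response with a null primitive field fails, as in the trace above:
	_, err := strictUnmarshal([]byte(`{"maxParallelism": null}`))
	fmt.Println(err != nil) // true: null rejected for a primitive field

	// A well-formed response parses normally:
	jd, err := strictUnmarshal([]byte(`{"maxParallelism": 128}`))
	fmt.Println(err == nil && jd.MaxParallelism == 128) // true
}
```

A null "maxParallelism" typically points at a version skew between the submitting client and the Flink cluster's REST endpoint, since newer JobDetailsInfo schemas added that field.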
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #620
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/620/display/redirect?page=changes>
Changes:
[noreply] fix minor unreachable code caused by log.Fatal (#22618)
------------------------------------------
[...truncated 33.64 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/15 08:43:14 Using specified **** binary: 'linux_amd64/combine'
2022/08/15 08:43:14 Prepared job with id: load-tests-go-flink-batch-combine-1-0815065307_df76d92e-61ed-4eb1-83d3-98e45f3862cd and staging token: load-tests-go-flink-batch-combine-1-0815065307_df76d92e-61ed-4eb1-83d3-98e45f3862cd
2022/08/15 08:43:19 Staged binary artifact with token:
2022/08/15 08:43:20 Submitted job: load0tests0go0flink0batch0combine0100815065307-root-0815084319-c5e765a7_e3f66643-8da2-462e-b868-bb81f48860ba
2022/08/15 08:43:20 Job state: STOPPED
2022/08/15 08:43:20 Job state: STARTING
2022/08/15 08:43:20 Job state: RUNNING
2022/08/15 08:44:29 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/15 08:44:29 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/15 08:44:29 Job state: FAILED
2022/08/15 08:44:29 Failed to execute job: job load0tests0go0flink0batch0combine0100815065307-root-0815084319-c5e765a7_e3f66643-8da2-462e-b868-bb81f48860ba failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100815065307-root-0815084319-c5e765a7_e3f66643-8da2-462e-b868-bb81f48860ba failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d6188, 0xc00004a0c0}, {0x1441b3f?, 0x1f2e470?}, {0xc000629e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 34s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/75zvw6hlqph3q
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #619
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/619/display/redirect>
Changes:
------------------------------------------
[...truncated 33.57 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/14 08:43:01 Using specified **** binary: 'linux_amd64/combine'
2022/08/14 08:43:02 Prepared job with id: load-tests-go-flink-batch-combine-1-0814065306_13769be9-ae91-44cf-83a6-6938d6c63e80 and staging token: load-tests-go-flink-batch-combine-1-0814065306_13769be9-ae91-44cf-83a6-6938d6c63e80
2022/08/14 08:43:05 Staged binary artifact with token:
2022/08/14 08:43:07 Submitted job: load0tests0go0flink0batch0combine0100814065306-root-0814084306-2f74f224_e47533f0-f141-4707-a022-f784c16d605d
2022/08/14 08:43:07 Job state: STOPPED
2022/08/14 08:43:07 Job state: STARTING
2022/08/14 08:43:07 Job state: RUNNING
2022/08/14 08:44:16 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/14 08:44:16 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
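The chain above bottoms out in a Jackson MismatchedInputException: the Flink REST client gets a JobDetailsInfo response whose "maxParallelism" field is JSON null, and with FAIL_ON_NULL_FOR_PRIMITIVES enabled Jackson refuses to map null into a primitive long. This often indicates the client and the session cluster disagree about the REST response schema (e.g. mismatched Flink versions). A minimal Go sketch of the same null-into-primitive distinction (illustrative only; jobDetails and parseMaxParallelism are invented names, not Flink or Beam code):

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// jobDetails mirrors the one field of the REST response that broke:
// a *int64 lets us observe whether the JSON value was null or absent,
// which a primitive (non-pointer) field cannot represent.
type jobDetails struct {
	MaxParallelism *int64 `json:"maxParallelism"`
}

// parseMaxParallelism behaves like Jackson with FAIL_ON_NULL_FOR_PRIMITIVES
// on (failOnNull=true, reject null) or off (failOnNull=false, substitute the
// primitive default 0).
func parseMaxParallelism(payload []byte, failOnNull bool) (int64, error) {
	var d jobDetails
	if err := json.Unmarshal(payload, &d); err != nil {
		return 0, err
	}
	if d.MaxParallelism == nil {
		if failOnNull {
			return 0, errors.New("cannot map `null` into type `long`")
		}
		return 0, nil // lenient mode: primitive default
	}
	return *d.MaxParallelism, nil
}

func main() {
	_, err := parseMaxParallelism([]byte(`{"maxParallelism": null}`), true)
	fmt.Println(err) // strict mode rejects the null, as in the trace above
	v, _ := parseMaxParallelism([]byte(`{"maxParallelism": null}`), false)
	fmt.Println(v) // lenient mode yields 0
}
```

The sketch only shows why the deserializer throws; it does not change the underlying cause, which is the cluster returning a null where the client's schema expects a number.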
2022/08/14 08:44:16 Job state: FAILED
2022/08/14 08:44:16 Failed to execute job: job load0tests0go0flink0batch0combine0100814065306-root-0814084306-2f74f224_e47533f0-f141-4707-a022-f784c16d605d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100814065306-root-0814084306-2f74f224_e47533f0-f141-4707-a022-f784c16d605d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d6188, 0xc00012e000}, {0x1441b3f?, 0x1f2e470?}, {0xc000261e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 31s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/y64mxpvqp4bg2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #618
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/618/display/redirect?page=changes>
Changes:
[yathu] Bump mongo_java_driver to 3.12.11 and embed.mongo to 3.0.0
[bulat.safiullin] [Website] add container with overflow-x to runners with table #22708
[noreply] Bump cloud.google.com/go/storage from 1.24.0 to 1.25.0 in /sdks (#22705)
[noreply] [Go SDK]: Implement standalone single-precision float encoder (#22664)
[noreply] [Playground] [Backend] added validation for snippet endpoints to avoid
[noreply] Add GeneratedClassRowTypeConstraint (#22679)
[noreply] [Playground] [Backend] Removing unused snippets manually and using the
[noreply] Implement PubsubSchemaTransformWriteConfiguration (#22262)
[noreply] Add support for FLOAT to Python RowCoder (#22626)
[noreply] Bump up python container versions (#22697)
------------------------------------------
[...truncated 33.70 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/13 08:43:36 Using specified worker binary: 'linux_amd64/combine'
2022/08/13 08:43:37 Prepared job with id: load-tests-go-flink-batch-combine-1-0813065310_3a68b231-c2af-4457-92e9-b4f675c386fb and staging token: load-tests-go-flink-batch-combine-1-0813065310_3a68b231-c2af-4457-92e9-b4f675c386fb
2022/08/13 08:43:41 Staged binary artifact with token:
2022/08/13 08:43:43 Submitted job: load0tests0go0flink0batch0combine0100813065310-root-0813084341-9aec1ca5_e5881845-7aec-48a1-9247-429f33faa78d
2022/08/13 08:43:43 Job state: STOPPED
2022/08/13 08:43:43 Job state: STARTING
2022/08/13 08:43:43 Job state: RUNNING
2022/08/13 08:44:52 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/13 08:44:52 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/13 08:44:52 Job state: FAILED
2022/08/13 08:44:52 Failed to execute job: job load0tests0go0flink0batch0combine0100813065310-root-0813084341-9aec1ca5_e5881845-7aec-48a1-9247-429f33faa78d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100813065310-root-0813084341-9aec1ca5_e5881845-7aec-48a1-9247-429f33faa78d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15d6188, 0xc00004a0c0}, {0x1441b3f?, 0x1f2e470?}, {0xc00034fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 45s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/zren2cl47ozti
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #617
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/617/display/redirect?page=changes>
Changes:
[noreply] Fix seed job (#22687)
[noreply] Bump actions/stale from 3 to 5 (#22684)
[noreply] Bump actions/upload-artifact from 2 to 3 (#22682)
[noreply] Bump actions/download-artifact from 2 to 3 (#22683)
[noreply] Add shunts for Beam typehints (#22680)
[noreply] Fix wordcount setup-java (#22700)
[noreply] Bump google.golang.org/api from 0.91.0 to 0.92.0 in /sdks (#22681)
------------------------------------------
[...truncated 33.70 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/12 08:43:29 Using specified worker binary: 'linux_amd64/combine'
2022/08/12 08:43:30 Prepared job with id: load-tests-go-flink-batch-combine-1-0812065316_b7d34318-0cda-439a-8bae-ad1c441651a5 and staging token: load-tests-go-flink-batch-combine-1-0812065316_b7d34318-0cda-439a-8bae-ad1c441651a5
2022/08/12 08:43:34 Staged binary artifact with token:
2022/08/12 08:43:35 Submitted job: load0tests0go0flink0batch0combine0100812065316-root-0812084334-fe4a13c7_0f7d6365-05ec-4199-8095-da532295cba4
2022/08/12 08:43:35 Job state: STOPPED
2022/08/12 08:43:35 Job state: STARTING
2022/08/12 08:43:35 Job state: RUNNING
2022/08/12 08:44:44 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/12 08:44:44 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/12 08:44:44 Job state: FAILED
2022/08/12 08:44:44 Failed to execute job: job load0tests0go0flink0batch0combine0100812065316-root-0812084334-fe4a13c7_0f7d6365-05ec-4199-8095-da532295cba4 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100812065316-root-0812084334-fe4a13c7_0f7d6365-05ec-4199-8095-da532295cba4 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc0001a6000}, {0x1430a35?, 0x1f14ef0?}, {0xc0002b9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
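A note on the "Number of retries has been exhausted" wrapper in the trace above: it comes from Flink's FutureUtils.lambda$retryOperationWithDelay, which re-polls the REST endpoint for job details and gives up after a fixed retry budget because every attempt hits the same parse error. The following is a simplified, JDK-only sketch of that retry shape (delay and scheduling omitted; this is not Flink's actual implementation, and all names are illustrative):

```java
import java.util.concurrent.CompletableFuture;
import java.util.function.Supplier;

// Minimal sketch of a retryOperation-style helper: run a future-returning
// operation, and on failure retry until the budget is exhausted, then fail
// with a "retries exhausted" error wrapping the last cause.
public class RetrySketch {
    public static <T> CompletableFuture<T> retry(
            Supplier<CompletableFuture<T>> operation, int retries) {
        return operation.get().handle((result, error) -> {
            if (error == null) {
                // Operation succeeded; pass the result through.
                return CompletableFuture.completedFuture(result);
            }
            if (retries <= 0) {
                // Budget exhausted: surface a terminal failure.
                CompletableFuture<T> failed = new CompletableFuture<>();
                failed.completeExceptionally(new RuntimeException(
                        "Could not complete the operation. "
                                + "Number of retries has been exhausted.", error));
                return failed;
            }
            // Try again with one fewer retry remaining.
            return retry(operation, retries - 1);
        }).thenCompose(f -> f);
    }
}
```

In the log, the retried operation always fails the same way (the REST response never parses), so the retry loop only delays the inevitable terminal failure.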
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/t2xxdhdldzivo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #616
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/616/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] update contribution content collapse
[noreply] Adhoc: Fix logging in Spark runner to avoid unnecessary creation of
[noreply] Improve exception when requested error tag does not exist (#22401)
[noreply] Reimplement Pub/Sub Lite's I/O using UnboundedSource. (#22612)
[noreply] Clean up checkstyle suppressions.xml (#22649)
[noreply] [Playground] [Infrastructure] format python code style (#22291)
[noreply] Minor: Add helpful names for parameterized dataframe.schemas_test
[noreply] [BEAM-14118, #21639] Use vendored gRPC 1.48.1 (#22628)
[Ismaël Mejía] Fix #22466 Add github actions dependency updates with dependabot
[noreply] Change Python PostCommits timeout (#22655)
[noreply] Revert "Persist ghprbPullId parameter in seed job (#22579)" (#22656)
[noreply] Bump actions/setup-java from 2 to 3 (#22666)
[noreply] Bump actions/labeler from 3 to 4 (#22670)
[noreply] Bump actions/setup-node from 2 to 3 (#22671)
[noreply] Bump actions/setup-go from 2 to 3 (#22669)
[noreply] Bump actions/setup-python from 2 to 4 (#22668)
[noreply] Bump actions/checkout from 2 to 3 (#22667)
[noreply] Fix broken link to Retry Policy blog (#22554)
[noreply] Include total in header of issue report (#22475)
[chamikaramj] Update vendored gRPC version for SpannerTransformRegistrarTest
[noreply] [Playground] Share any code feature frontend (#22477)
[noreply] Remove subprocess.PIPE usage by using a temp file (#22654)
[noreply] [#22647] Upgrade org.apache.samza to 1.6 (#22648)
------------------------------------------
[...truncated 33.80 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
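For readers skimming the coder dump above: the "beam:coder:length_prefix:v1" entries (c14, c4, c8) wrap a component coder so that each element's encoded bytes are preceded by their length, which lets the runner skip over opaque custom-coder payloads. A JDK-only illustrative sketch of that framing, assuming the standard base-128 varint length encoding (this is not Beam's actual coder implementation):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// Sketch of length-prefix framing: write the element length as a base-128
// varint, then the element's encoded bytes.
public class LengthPrefixSketch {
    static void writeVarInt(OutputStream out, long value) throws IOException {
        // Emit 7 bits at a time, low group first; high bit marks continuation.
        while ((value & ~0x7FL) != 0) {
            out.write((int) ((value & 0x7F) | 0x80));
            value >>>= 7;
        }
        out.write((int) value);
    }

    public static byte[] lengthPrefixed(byte[] encodedElement) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writeVarInt(out, encodedElement.length);
        out.write(encodedElement);
        return out.toByteArray();
    }
}
```

For a 3-byte element the framed output is 4 bytes: one length byte followed by the element itself.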
2022/08/11 08:43:54 Using specified worker binary: 'linux_amd64/combine'
2022/08/11 08:43:55 Prepared job with id: load-tests-go-flink-batch-combine-1-0809170707_14dddc35-5b5a-4dc4-922d-d5227c6a67b8 and staging token: load-tests-go-flink-batch-combine-1-0809170707_14dddc35-5b5a-4dc4-922d-d5227c6a67b8
2022/08/11 08:43:59 Staged binary artifact with token:
2022/08/11 08:44:00 Submitted job: load0tests0go0flink0batch0combine0100809170707-root-0811084359-f2dd5eef_2e189742-8698-4c6b-8abd-2d8b64158e9a
2022/08/11 08:44:00 Job state: STOPPED
2022/08/11 08:44:00 Job state: STARTING
2022/08/11 08:44:00 Job state: RUNNING
2022/08/11 08:45:08 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/11 08:45:08 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/11 08:45:09 Job state: FAILED
2022/08/11 08:45:09 Failed to execute job: job load0tests0go0flink0batch0combine0100809170707-root-0811084359-f2dd5eef_2e189742-8698-4c6b-8abd-2d8b64158e9a failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100809170707-root-0811084359-f2dd5eef_2e189742-8698-4c6b-8abd-2d8b64158e9a failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc00004a0c0}, {0x1430a35?, 0x1f14ef0?}, {0xc000363e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
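For context on the MismatchedInputException that all of these builds hit: the REST response's JobDetailsInfo "maxParallelism" field arrives as JSON null, and Jackson refuses to map null into Java's primitive long (the message itself suggests disabling FAIL_ON_NULL_FOR_PRIMITIVES, which would coerce null to 0 instead). The constraint is fundamental to the JVM: null cannot be unboxed into a primitive. A JDK-only sketch (the class and field names below are illustrative, not Flink code):

```java
public class NullIntoPrimitive {
    // Hypothetical stand-in for a REST payload field that arrived as null,
    // like JobDetailsInfo["maxParallelism"] in the trace above.
    static Long maxParallelismFromJson = null;

    // Unboxing null into a primitive long throws NullPointerException, which
    // is why a deserializer targeting a primitive must either reject the
    // input (Jackson's FAIL_ON_NULL_FOR_PRIMITIVES default) or substitute 0.
    static String describe() {
        try {
            long maxParallelism = maxParallelismFromJson; // unboxing null
            return "mapped: " + maxParallelism;
        } catch (NullPointerException e) {
            return "cannot map null into long";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe()); // prints "cannot map null into long"
    }
}
```

A null "maxParallelism" in the response typically indicates a client/server mismatch in the REST payload shape rather than a transient failure, which is consistent with the retries never succeeding.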
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hwuhepf5ybk42
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #615
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/615/display/redirect?page=changes>
Changes:
[alexey.inkin] Fix retaining unsaved pipeline options (#22075)
[108862444+oborysevych] removed VladMatyunin from beam collaborators
[anandinguva98] Add stdlib distutils while building the wheels
[noreply] Bump google.golang.org/api from 0.90.0 to 0.91.0 in /sdks (#22568)
[noreply] Fix for #22631 KafkaIO considers readCommitted() as it would commit back
[noreply] [CdapIO] Add CdapIO dashboard in Grafana (#22641)
[noreply] Add information on how to take/close issues in the contribution guide.
[noreply] Skip
[noreply] Persist ghprbPullId parameter in seed job (#22579)
------------------------------------------
[...truncated 33.71 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/10 08:43:25 Using specified worker binary: 'linux_amd64/combine'
2022/08/10 08:43:26 Prepared job with id: load-tests-go-flink-batch-combine-1-0809170707_345ebc68-c124-41bb-bcea-a9bbbf3d5fea and staging token: load-tests-go-flink-batch-combine-1-0809170707_345ebc68-c124-41bb-bcea-a9bbbf3d5fea
2022/08/10 08:43:30 Staged binary artifact with token:
2022/08/10 08:43:31 Submitted job: load0tests0go0flink0batch0combine0100809170707-root-0810084330-afed3a6_c9841ffe-2a2b-41e4-98bf-6e430c2021fd
2022/08/10 08:43:31 Job state: STOPPED
2022/08/10 08:43:31 Job state: STARTING
2022/08/10 08:43:31 Job state: RUNNING
2022/08/10 08:44:40 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/10 08:44:40 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/10 08:44:40 Job state: FAILED
2022/08/10 08:44:40 Failed to execute job: job load0tests0go0flink0batch0combine0100809170707-root-0810084330-afed3a6_c9841ffe-2a2b-41e4-98bf-6e430c2021fd failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100809170707-root-0810084330-afed3a6_c9841ffe-2a2b-41e4-98bf-6e430c2021fd failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc00004a0c0}, {0x1430a35?, 0x1f14ef0?}, {0xc0006e3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/lm5ktauxwf4tm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
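The root cause in the trace above is the Flink REST client's strict Jackson mapper rejecting a JSON `null` for the primitive `long` field `maxParallelism` while parsing the `JobDetailsInfo` response, which commonly points to a version mismatch between the Flink client used by the Beam job server and the session cluster answering the REST call. A minimal Go sketch (standard `encoding/json` only; the type and field names mirror the error message but are illustrative, not Flink code) of the strict vs. lenient behavior that the `FAIL_ON_NULL_FOR_PRIMITIVES` hint in the message refers to:

```go
package main

import (
	"encoding/json"
	"errors"
	"fmt"
)

// jobDetails mirrors the field named in the error; it is NOT Flink's actual type.
type jobDetails struct {
	// A pointer makes a JSON null observable (it stays nil after Unmarshal).
	MaxParallelism *int64 `json:"maxParallelism"`
}

// parseStrict rejects a null maxParallelism, analogous to Jackson mapping
// null into a primitive `long` with FAIL_ON_NULL_FOR_PRIMITIVES enabled.
func parseStrict(data []byte) (int64, error) {
	var d jobDetails
	if err := json.Unmarshal(data, &d); err != nil {
		return 0, err
	}
	if d.MaxParallelism == nil {
		return 0, errors.New("cannot map null into type long: maxParallelism")
	}
	return *d.MaxParallelism, nil
}

// parseLenient coerces null to the primitive default, analogous to setting
// FAIL_ON_NULL_FOR_PRIMITIVES to false as the error message suggests.
func parseLenient(data []byte) int64 {
	var d jobDetails
	if json.Unmarshal(data, &d) != nil || d.MaxParallelism == nil {
		return 0
	}
	return *d.MaxParallelism
}

func main() {
	resp := []byte(`{"maxParallelism": null}`)
	_, err := parseStrict(resp)
	fmt.Println(err != nil)         // strict parsing rejects the null
	fmt.Println(parseLenient(resp)) // lenient parsing yields the zero default
}
```

Relaxing the mapper would only mask the symptom; aligning the client and cluster Flink versions so the response actually carries `maxParallelism` is the usual fix.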
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #614
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/614/display/redirect?page=changes>
Changes:
[vlad.matyunin] modifed WithKeys Playground Example
[alexander.zhuravlev] [Playground] Removed banner from Playground header, deleted unused
[shivam] Add example for `Distinct` PTransform
[manitgupta] Fix bug in StructUtils
[noreply] Add PyDoc buttons to the top and bottom of the Machine Learning page
[noreply] [Playground][Backend][Bug]: Moving the initialization of properties file
[noreply] Bump cloud.google.com/go/bigquery from 1.36.0 to 1.37.0 in /sdks
[noreply] Minor: Clean up an assertion in schemas_test (#22613)
[noreply] Exclude testWithShardedKeyInGlobalWindow on streaming runner v1 (#22593)
[noreply] Pub/Sub Schema Transform Read Provider (#22145)
[noreply] Update BigQuery URI validation to allow more valid URIs through (#22452)
[noreply] Add units tests for SpannerIO (#22428)
------------------------------------------
[...truncated 33.84 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/09 08:44:50 Using specified **** binary: 'linux_amd64/combine'
2022/08/09 08:44:50 Prepared job with id: load-tests-go-flink-batch-combine-1-0809065307_458917ae-00b0-4d05-890c-5e7401216821 and staging token: load-tests-go-flink-batch-combine-1-0809065307_458917ae-00b0-4d05-890c-5e7401216821
2022/08/09 08:44:57 Staged binary artifact with token:
2022/08/09 08:44:58 Submitted job: load0tests0go0flink0batch0combine0100809065307-root-0809084457-505e4227_47c50270-7468-4429-80c9-e8dcf256d58e
2022/08/09 08:44:58 Job state: STOPPED
2022/08/09 08:44:58 Job state: STARTING
2022/08/09 08:44:58 Job state: RUNNING
2022/08/09 08:46:07 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/09 08:46:07 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/09 08:46:07 Job state: FAILED
2022/08/09 08:46:07 Failed to execute job: job load0tests0go0flink0batch0combine0100809065307-root-0809084457-505e4227_47c50270-7468-4429-80c9-e8dcf256d58e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100809065307-root-0809084457-505e4227_47c50270-7468-4429-80c9-e8dcf256d58e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc0001a6000}, {0x1430a35?, 0x1f14ef0?}, {0xc0005efe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 2m 25s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/m7ayqzecp6hri
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #613
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/613/display/redirect>
Changes:
------------------------------------------
[...truncated 33.69 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/08 08:43:25 Using specified **** binary: 'linux_amd64/combine'
2022/08/08 08:43:26 Prepared job with id: load-tests-go-flink-batch-combine-1-0808065313_501dc81e-9213-4e9a-957c-da2eeed67d7c and staging token: load-tests-go-flink-batch-combine-1-0808065313_501dc81e-9213-4e9a-957c-da2eeed67d7c
2022/08/08 08:43:30 Staged binary artifact with token:
2022/08/08 08:43:31 Submitted job: load0tests0go0flink0batch0combine0100808065313-root-0808084330-8ea454ba_e9ff4674-2c20-4bbe-8f85-0e0f25fd3090
2022/08/08 08:43:31 Job state: STOPPED
2022/08/08 08:43:31 Job state: STARTING
2022/08/08 08:43:31 Job state: RUNNING
2022/08/08 08:44:40 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/08 08:44:40 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/08 08:44:41 Job state: FAILED
2022/08/08 08:44:41 Failed to execute job: job load0tests0go0flink0batch0combine0100808065313-root-0808084330-8ea454ba_e9ff4674-2c20-4bbe-8f85-0e0f25fd3090 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100808065313-root-0808084330-8ea454ba_e9ff4674-2c20-4bbe-8f85-0e0f25fd3090 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc00012e000}, {0x1430a35?, 0x1f14ef0?}, {0xc0000e3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 31s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/d7miqykwpyzqu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #612
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/612/display/redirect>
Changes:
------------------------------------------
[...truncated 33.56 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/07 08:42:57 Using specified worker binary: 'linux_amd64/combine'
2022/08/07 08:42:58 Prepared job with id: load-tests-go-flink-batch-combine-1-0807065309_495ead3e-a35f-4581-9649-4e843847917f and staging token: load-tests-go-flink-batch-combine-1-0807065309_495ead3e-a35f-4581-9649-4e843847917f
2022/08/07 08:43:01 Staged binary artifact with token:
2022/08/07 08:43:03 Submitted job: load0tests0go0flink0batch0combine0100807065309-root-0807084302-b0d760b3_4f4665d0-8d9d-4c2c-988e-e885a1d57dd1
2022/08/07 08:43:03 Job state: STOPPED
2022/08/07 08:43:03 Job state: STARTING
2022/08/07 08:43:03 Job state: RUNNING
2022/08/07 08:44:11 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/07 08:44:11 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/07 08:44:11 Job state: FAILED
2022/08/07 08:44:11 Failed to execute job: job load0tests0go0flink0batch0combine0100807065309-root-0807084302-b0d760b3_4f4665d0-8d9d-4c2c-988e-e885a1d57dd1 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100807065309-root-0807084302-b0d760b3_4f4665d0-8d9d-4c2c-988e-e885a1d57dd1 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc00012e000}, {0x1430a35?, 0x1f14ef0?}, {0xc0006fde70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 32s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hg3wqc73rv6io
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #611
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/611/display/redirect?page=changes>
Changes:
[yathu] Moving misplaced CHANGES from template to 2.41.0
[noreply] [BEAM-14117] Delete vendored bytebuddy gradle build (#22594)
[noreply] Add Import transform to Go FhirIO (#22460)
[noreply] Allow unsafe triggers for python nexmark benchmarks (#22596)
[noreply] pubsublite: Fix max offset for computing backlog (#22585)
[noreply] Add support when writing to locked buckets by handling
[noreply] [BEAM-14118, #21639] Vendor gRPC 1.48.1 (#22607)
[noreply] [21894] Validates inference_args early (#22282)
[noreply] Return type for _ExpandIntoRanges DoFn should be Iterable. (#22548)
------------------------------------------
[...truncated 33.75 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/06 08:43:41 Using specified worker binary: 'linux_amd64/combine'
2022/08/06 08:43:41 Prepared job with id: load-tests-go-flink-batch-combine-1-0806065309_4f382259-c1cb-40a3-b10b-94b7e6aa72c2 and staging token: load-tests-go-flink-batch-combine-1-0806065309_4f382259-c1cb-40a3-b10b-94b7e6aa72c2
2022/08/06 08:43:45 Staged binary artifact with token:
2022/08/06 08:43:46 Submitted job: load0tests0go0flink0batch0combine0100806065309-root-0806084346-c57db71e_dbe05ada-9e5d-426e-ab82-1582f0ea0570
2022/08/06 08:43:46 Job state: STOPPED
2022/08/06 08:43:46 Job state: STARTING
2022/08/06 08:43:46 Job state: RUNNING
2022/08/06 08:44:56 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/06 08:44:56 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
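The MismatchedInputException above indicates that Flink's REST client maps the JobDetailsInfo response with Jackson's FAIL_ON_NULL_FOR_PRIMITIVES behavior in effect, so a `"maxParallelism": null` field is rejected rather than coerced to 0. A minimal, stdlib-only Java sketch of the underlying constraint (this is not Flink or Jackson code; the class and field names are hypothetical stand-ins):

```java
// Hypothetical stand-in for a deserialization target like JobDetailsInfo:
// a JSON null has no primitive representation, so assigning it to a
// primitive long must fail. Jackson enforces the same rule at mapping time
// when FAIL_ON_NULL_FOR_PRIMITIVES is enabled.
public class NullIntoPrimitive {
    static long maxParallelism; // primitive field, cannot hold null

    public static void main(String[] args) {
        Long fromJson = null; // what a "maxParallelism": null response yields
        boolean rejected = false;
        try {
            maxParallelism = fromJson; // unboxing null throws NullPointerException
        } catch (NullPointerException e) {
            rejected = true;
        }
        System.out.println("null rejected for primitive long: " + rejected);
    }
}
```

As the error message itself notes, the strict behavior can be relaxed on the deserializing side by disabling DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, in which case null is coerced to the primitive's default (0 for long).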
2022/08/06 08:44:56 Job state: FAILED
2022/08/06 08:44:56 Failed to execute job: job load0tests0go0flink0batch0combine0100806065309-root-0806084346-c57db71e_dbe05ada-9e5d-426e-ab82-1582f0ea0570 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100806065309-root-0806084346-c57db71e_dbe05ada-9e5d-426e-ab82-1582f0ea0570 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc00004a0c0}, {0x1430a35?, 0x1f14ef0?}, {0xc000311e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 34s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/gzr5uaudjjfsq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #610
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/610/display/redirect?page=changes>
Changes:
[noreply] Update Dataflow container version (#22580)
[noreply] Merge pull request #22347: [22188]Set allowed timestamp skew
[noreply] Added experimental annotation to fixes #22564 (#22565)
------------------------------------------
[...truncated 33.74 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/05 08:43:48 Using specified **** binary: 'linux_amd64/combine'
2022/08/05 08:43:48 Prepared job with id: load-tests-go-flink-batch-combine-1-0805065311_5bc8addb-08a5-433a-a527-b1240f36d8d2 and staging token: load-tests-go-flink-batch-combine-1-0805065311_5bc8addb-08a5-433a-a527-b1240f36d8d2
2022/08/05 08:43:52 Staged binary artifact with token:
2022/08/05 08:43:53 Submitted job: load0tests0go0flink0batch0combine0100805065311-root-0805084352-f99fd5eb_ae55c7b6-5d22-452a-8353-2e18f6596ab8
2022/08/05 08:43:53 Job state: STOPPED
2022/08/05 08:43:53 Job state: STARTING
2022/08/05 08:43:53 Job state: RUNNING
2022/08/05 08:45:03 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/05 08:45:03 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/05 08:45:03 Job state: FAILED
2022/08/05 08:45:03 Failed to execute job: job load0tests0go0flink0batch0combine0100805065311-root-0805084352-f99fd5eb_ae55c7b6-5d22-452a-8353-2e18f6596ab8 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100805065311-root-0805084352-f99fd5eb_ae55c7b6-5d22-452a-8353-2e18f6596ab8 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc000198000}, {0x1430a35?, 0x1f14ef0?}, {0xc000399e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/fhiwn4nf6tkuq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #609
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/609/display/redirect?page=changes>
Changes:
[chamikaramj] Mention Java RunInference support in the Website
[noreply] Update run_inference_basic.ipynb
[noreply] Update CHANGE.md after 2.41.0 cut (#22577)
[noreply] Convert to BeamSchema type from ReadfromBQ (#17159)
[noreply] Fix deleteTimer in InMemoryTimerInternals and enable VR tests for
------------------------------------------
[...truncated 33.67 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/04 08:43:52 Using specified **** binary: 'linux_amd64/combine'
2022/08/04 08:43:53 Prepared job with id: load-tests-go-flink-batch-combine-1-0804081332_56835c73-7988-4bbf-b61c-6707f1946842 and staging token: load-tests-go-flink-batch-combine-1-0804081332_56835c73-7988-4bbf-b61c-6707f1946842
2022/08/04 08:43:57 Staged binary artifact with token:
2022/08/04 08:43:58 Submitted job: load0tests0go0flink0batch0combine0100804081332-root-0804084357-a382b18f_7e3beb6e-2e93-411f-91a8-d714a463d664
2022/08/04 08:43:58 Job state: STOPPED
2022/08/04 08:43:58 Job state: STARTING
2022/08/04 08:43:58 Job state: RUNNING
2022/08/04 08:45:07 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/04 08:45:07 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/04 08:45:07 Job state: FAILED
2022/08/04 08:45:07 Failed to execute job: job load0tests0go0flink0batch0combine0100804081332-root-0804084357-a382b18f_7e3beb6e-2e93-411f-91a8-d714a463d664 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100804081332-root-0804084357-a382b18f_7e3beb6e-2e93-411f-91a8-d714a463d664 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc00012e000}, {0x1430a35?, 0x1f14ef0?}, {0xc000295e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/w7vzgkxwcx5zm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
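The root cause in the trace above is a Jackson deserialization failure: the Flink REST endpoint returns `"maxParallelism": null` in the `JobDetailsInfo` response, and with `FAIL_ON_NULL_FOR_PRIMITIVES` enabled Jackson refuses to map a JSON `null` into a primitive `long`. For contrast, Go's `encoding/json` (the language of the failing load test) treats a JSON `null` for a numeric field as a no-op, leaving the zero value. A minimal sketch, with `jobDetails` as a hypothetical stand-in for the relevant slice of Flink's `JobDetailsInfo` message:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// jobDetails is a hypothetical stand-in for Flink's JobDetailsInfo
// REST message; only the field involved in the failure is modeled.
type jobDetails struct {
	JID            string `json:"jid"`
	MaxParallelism int64  `json:"maxParallelism"`
}

// parseMaxParallelism decodes a JobDetailsInfo-like payload. A JSON
// null for maxParallelism is a no-op in encoding/json, so the field
// keeps its zero value, whereas Jackson (with the
// FAIL_ON_NULL_FOR_PRIMITIVES feature enabled) throws the
// MismatchedInputException seen in the log.
func parseMaxParallelism(data []byte) (int64, error) {
	var d jobDetails
	if err := json.Unmarshal(data, &d); err != nil {
		return 0, err
	}
	return d.MaxParallelism, nil
}

func main() {
	v, err := parseMaxParallelism([]byte(`{"jid":"abc","maxParallelism":null}`))
	fmt.Println(v, err) // 0 <nil>
}
```

This is only an illustration of the null-into-primitive mismatch; the actual fix lies on the Flink side (either the REST response populating `maxParallelism` or the client's deserialization config).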
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #608
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/608/display/redirect?page=changes>
Changes:
[Valentyn Tymofieiev] add zstd compression support according to issue 22393
[Valentyn Tymofieiev] Regenerate the container dependencies.
[noreply] Remove normalization in Pytorch Image Segmentation example (#22371)
[noreply] Downgrade less informative logs during write to files (#22273)
[noreply] Beam ml notebooks (#22510)
[noreply] Add clearer error message for xlang transforms on the Go Direct Runner
[noreply] [CdapIO] Add integration tests for CdapIO (Batch) (#22313)
[noreply] Bugfix: Fix broken assertion in PipelineTest (#22485)
------------------------------------------
[...truncated 34.40 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/03 08:46:06 Using specified **** binary: 'linux_amd64/combine'
2022/08/03 08:46:07 Prepared job with id: load-tests-go-flink-batch-combine-1-0803065309_3f2b4131-79ab-4843-82bf-9c84336e3eff and staging token: load-tests-go-flink-batch-combine-1-0803065309_3f2b4131-79ab-4843-82bf-9c84336e3eff
2022/08/03 08:46:12 Staged binary artifact with token:
2022/08/03 08:46:13 Submitted job: load0tests0go0flink0batch0combine0100803065309-root-0803084613-eaa2eec3_ba581e41-5502-4cd1-a7e2-78ad0965240b
2022/08/03 08:46:13 Job state: STOPPED
2022/08/03 08:46:13 Job state: STARTING
2022/08/03 08:46:13 Job state: RUNNING
2022/08/03 08:47:22 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/03 08:47:22 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/03 08:47:22 Job state: FAILED
2022/08/03 08:47:22 Failed to execute job: job load0tests0go0flink0batch0combine0100803065309-root-0803084613-eaa2eec3_ba581e41-5502-4cd1-a7e2-78ad0965240b failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100803065309-root-0803084613-eaa2eec3_ba581e41-5502-4cd1-a7e2-78ad0965240b failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2728, 0xc00004a0c0}, {0x1430a35?, 0x1f14ef0?}, {0xc0006b1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4d5habssgfbuw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
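The `beam:go:coder:custom:v1` payloads in the pipeline proto dumps above are base64 text wrapping a serialized coder spec; decoding one shows which Go type and encoder/decoder functions it registers. A sketch that scrapes the printable identifiers out of the `c13` payload (identifier-scraping for inspection only, not a real proto parse; the padded/unpadded base64 fallback is an assumption about how the dump was produced):

```go
package main

import (
	"encoding/base64"
	"fmt"
	"regexp"
)

// c13Payload is the payload of the beam:go:coder:custom:v1 coder
// "c13" copied from the pipeline proto dump above.
const c13Payload = "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"

// customCoderIdents base64-decodes a custom-coder payload and pulls
// out the embedded top.accum* identifiers, in first-seen order,
// without parsing the serialized coder spec itself.
func customCoderIdents(b64 string) ([]string, error) {
	raw, err := base64.StdEncoding.DecodeString(b64)
	if err != nil {
		// Fall back to unpadded base64 in case the dump strips padding.
		raw, err = base64.RawStdEncoding.DecodeString(b64)
		if err != nil {
			return nil, err
		}
	}
	re := regexp.MustCompile(`top\.accum[A-Za-z0-9.]*`)
	seen := map[string]bool{}
	var out []string
	for _, m := range re.FindAllString(string(raw), -1) {
		if !seen[m] {
			seen[m] = true
			out = append(out, m)
		}
	}
	return out, nil
}

func main() {
	idents, err := customCoderIdents(c13Payload)
	fmt.Println(idents, err)
}
```

Decoding shows the coder is for the Go SDK's `top.accum` type together with its generated `top.accumEnc.func1` / `top.accumDec.func1` encode and decode closures, which is why `c14` wraps `c13` in a `beam:coder:length_prefix:v1`.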
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #607
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/607/display/redirect?page=changes>
Changes:
[noreply] Exclude grpcio==1.48.0 (#22539)
[noreply] Merge PR #22304 fixing #22331 fixing JDBC IO IT
[noreply] Update pytest to support Python 3.10 (#22055)
[noreply] Update the imprecise link. (#22549)
------------------------------------------
[...truncated 33.49 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/02 08:43:15 Using specified **** binary: 'linux_amd64/combine'
2022/08/02 08:43:15 Prepared job with id: load-tests-go-flink-batch-combine-1-0802065311_b0ef1b69-e2bc-4e28-b966-9ffa929da48d and staging token: load-tests-go-flink-batch-combine-1-0802065311_b0ef1b69-e2bc-4e28-b966-9ffa929da48d
2022/08/02 08:43:20 Staged binary artifact with token:
2022/08/02 08:43:21 Submitted job: load0tests0go0flink0batch0combine0100802065311-root-0802084320-850b41fc_4b0bc4c2-a640-47e1-995e-f75b72ba2eb6
2022/08/02 08:43:21 Job state: STOPPED
2022/08/02 08:43:21 Job state: STARTING
2022/08/02 08:43:21 Job state: RUNNING
2022/08/02 08:44:30 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/02 08:44:30 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/02 08:44:30 Job state: FAILED
2022/08/02 08:44:30 Failed to execute job: job load0tests0go0flink0batch0combine0100802065311-root-0802084320-850b41fc_4b0bc4c2-a640-47e1-995e-f75b72ba2eb6 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100802065311-root-0802084320-850b41fc_4b0bc4c2-a640-47e1-995e-f75b72ba2eb6 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2688, 0xc00004a0c0}, {0x1430a35?, 0x1f14ef0?}, {0xc00064de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 27s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jobxttshir4xs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #606
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/606/display/redirect>
Changes:
------------------------------------------
[...truncated 33.59 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/08/01 08:43:09 Using specified **** binary: 'linux_amd64/combine'
2022/08/01 08:43:09 Prepared job with id: load-tests-go-flink-batch-combine-1-0801065311_4fffba60-4bfc-481b-a6c2-5355e62c2634 and staging token: load-tests-go-flink-batch-combine-1-0801065311_4fffba60-4bfc-481b-a6c2-5355e62c2634
2022/08/01 08:43:14 Staged binary artifact with token:
2022/08/01 08:43:15 Submitted job: load0tests0go0flink0batch0combine0100801065311-root-0801084314-9d8c1b63_29dde55b-bc70-4648-97d4-dbd4aa6b3150
2022/08/01 08:43:15 Job state: STOPPED
2022/08/01 08:43:15 Job state: STARTING
2022/08/01 08:43:15 Job state: RUNNING
2022/08/01 08:44:23 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/08/01 08:44:23 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/08/01 08:44:23 Job state: FAILED
2022/08/01 08:44:23 Failed to execute job: job load0tests0go0flink0batch0combine0100801065311-root-0801084314-9d8c1b63_29dde55b-bc70-4648-97d4-dbd4aa6b3150 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100801065311-root-0801084314-9d8c1b63_29dde55b-bc70-4648-97d4-dbd4aa6b3150 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2688, 0xc00012e000}, {0x1430a35?, 0x1f14ef0?}, {0xc000309e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 31s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jvcf77e4mlgoc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #605
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/605/display/redirect>
Changes:
------------------------------------------
[...truncated 33.58 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/31 08:43:15 Using specified **** binary: 'linux_amd64/combine'
2022/07/31 08:43:15 Prepared job with id: load-tests-go-flink-batch-combine-1-0731065307_e2a48069-0717-4566-9ef6-dba7d071431e and staging token: load-tests-go-flink-batch-combine-1-0731065307_e2a48069-0717-4566-9ef6-dba7d071431e
2022/07/31 08:43:19 Staged binary artifact with token:
2022/07/31 08:43:21 Submitted job: load0tests0go0flink0batch0combine0100731065307-root-0731084320-e11c6f71_7dc31eda-4152-48d4-98e9-6169e137d524
2022/07/31 08:43:21 Job state: STOPPED
2022/07/31 08:43:21 Job state: STARTING
2022/07/31 08:43:21 Job state: RUNNING
2022/07/31 08:44:30 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/31 08:44:30 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/31 08:44:30 Job state: FAILED
2022/07/31 08:44:30 Failed to execute job: job load0tests0go0flink0batch0combine0100731065307-root-0731084320-e11c6f71_7dc31eda-4152-48d4-98e9-6169e137d524 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100731065307-root-0731084320-e11c6f71_7dc31eda-4152-48d4-98e9-6169e137d524 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2688, 0xc00004a0c0}, {0x1430a35?, 0x1f14ef0?}, {0xc00032be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 34s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hqv52wqs5sanq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #604
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/604/display/redirect?page=changes>
Changes:
[noreply] Bump google.golang.org/protobuf from 1.28.0 to 1.28.1 in /sdks (#22517)
[noreply] Bump google.golang.org/api from 0.89.0 to 0.90.0 in /sdks (#22518)
[noreply] Change _build import from setuptools to distutils (#22503)
[noreply] Remove stringx package (#22534)
[noreply] Improve concrete error message (#22536)
------------------------------------------
[...truncated 33.60 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/30 08:43:38 Using specified worker binary: 'linux_amd64/combine'
2022/07/30 08:43:38 Prepared job with id: load-tests-go-flink-batch-combine-1-0730065310_d868646b-f02a-481b-a8fd-88b4c03edf38 and staging token: load-tests-go-flink-batch-combine-1-0730065310_d868646b-f02a-481b-a8fd-88b4c03edf38
2022/07/30 08:43:42 Staged binary artifact with token:
2022/07/30 08:43:43 Submitted job: load0tests0go0flink0batch0combine0100730065310-root-0730084342-2280ac49_11b46ee3-685d-4cf2-9369-1317208d5d16
2022/07/30 08:43:43 Job state: STOPPED
2022/07/30 08:43:43 Job state: STARTING
2022/07/30 08:43:43 Job state: RUNNING
2022/07/30 08:44:52 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/30 08:44:52 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/30 08:44:52 Job state: FAILED
2022/07/30 08:44:52 Failed to execute job: job load0tests0go0flink0batch0combine0100730065310-root-0730084342-2280ac49_11b46ee3-685d-4cf2-9369-1317208d5d16 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100730065310-root-0730084342-2280ac49_11b46ee3-685d-4cf2-9369-1317208d5d16 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2688, 0xc00004a0c0}, {0x1430a35?, 0x1f14ef0?}, {0xc000163e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uxpurydaugg5m
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #603
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/603/display/redirect?page=changes>
Changes:
[chamikaramj] Remove unnecessary reference to use_runner_v2 experiment in x-lang
[bulat.safiullin] [Website] remove beam-summit 2022 container with all related files
[yixiaoshen] Fix typo in Datastore V1ReadIT test
[noreply] Add read/write PubSub integration example fhirio pipeline (#22306)
[noreply] Remove deprecated Session runner (#22505)
[noreply] Add Go test status to the PR template (#22508)
[noreply] Relax the google-api-core dependency. (#22513)
------------------------------------------
[...truncated 33.64 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/29 08:43:23 Using specified worker binary: 'linux_amd64/combine'
2022/07/29 08:43:23 Prepared job with id: load-tests-go-flink-batch-combine-1-0729065312_5a423148-3762-41fb-ab18-39f5e1755825 and staging token: load-tests-go-flink-batch-combine-1-0729065312_5a423148-3762-41fb-ab18-39f5e1755825
2022/07/29 08:43:28 Staged binary artifact with token:
2022/07/29 08:43:29 Submitted job: load0tests0go0flink0batch0combine0100729065312-root-0729084328-85707629_e9c59091-b7b5-4901-bf4b-cabecb91ae60
2022/07/29 08:43:29 Job state: STOPPED
2022/07/29 08:43:29 Job state: STARTING
2022/07/29 08:43:29 Job state: RUNNING
2022/07/29 08:44:38 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/29 08:44:38 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/29 08:44:38 Job state: FAILED
2022/07/29 08:44:38 Failed to execute job: job load0tests0go0flink0batch0combine0100729065312-root-0729084328-85707629_e9c59091-b7b5-4901-bf4b-cabecb91ae60 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100729065312-root-0729084328-85707629_e9c59091-b7b5-4901-bf4b-cabecb91ae60 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c26c8, 0xc00004a0c0}, {0x1430a47?, 0x1f14ef0?}, {0xc0002ffe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/47ud6yvqta2q4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #602
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/602/display/redirect?page=changes>
Changes:
[samuelw] Fixes #22438. Ensure that WindmillStateReader completes all batched read
[noreply] 21730 fix offset resetting (#22450)
[noreply] Bump google.golang.org/api from 0.88.0 to 0.89.0 in /sdks (#22464)
[noreply] Upgrades pip before installing Beam for Python default expansion service
[noreply] [Go SDK]: Plumb allowed lateness to execution (#22476)
[Valentyn Tymofieiev] Restrict google-api-core
[Valentyn Tymofieiev] Regenerate the container dependencies.
[noreply] Replace distutils with supported modules. (#22456)
[noreply] [22369] Default Metrics for Executable Stages in Samza Runner (#22370)
[Kiley Sok] Moving to 2.42.0-SNAPSHOT on master branch.
[noreply] Remove stripping of step name. Replace removing only suffix step name
------------------------------------------
[...truncated 33.67 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/28 08:43:37 Using specified worker binary: 'linux_amd64/combine'
2022/07/28 08:43:38 Prepared job with id: load-tests-go-flink-batch-combine-1-0728065319_c150abff-930e-4335-869f-4ba4475dcd40 and staging token: load-tests-go-flink-batch-combine-1-0728065319_c150abff-930e-4335-869f-4ba4475dcd40
2022/07/28 08:43:42 Staged binary artifact with token:
2022/07/28 08:43:43 Submitted job: load0tests0go0flink0batch0combine0100728065319-root-0728084342-92ee9a7a_b59e7c20-dfd6-4959-b6fd-2cc780aabd5e
2022/07/28 08:43:43 Job state: STOPPED
2022/07/28 08:43:43 Job state: STARTING
2022/07/28 08:43:43 Job state: RUNNING
2022/07/28 08:44:53 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/28 08:44:53 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/28 08:44:53 Job state: FAILED
2022/07/28 08:44:53 Failed to execute job: job load0tests0go0flink0batch0combine0100728065319-root-0728084342-92ee9a7a_b59e7c20-dfd6-4959-b6fd-2cc780aabd5e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100728065319-root-0728084342-92ee9a7a_b59e7c20-dfd6-4959-b6fd-2cc780aabd5e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c26c8, 0xc00012e000}, {0x1430a47?, 0x1f14ef0?}, {0xc0006c9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/2cplf6kifiww6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #601
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/601/display/redirect?page=changes>
Changes:
[bulat.safiullin] add executeAsTemplate to head, head_homepage, add absURL to page-nav.js,
[chamikaramj] Adds KV support for the Java RunInference transform.
[noreply] Replace distutils with supported modules. (#21968)
[noreply] Revert "Replace distutils with supported modules. " (#22453)
[noreply] Enable configuration to avoid successfully written Table Row propagation
[noreply] lint fixes for recent import (#22455)
[noreply] Bump Python Combine LoadTests timeout to 12 hours (#22439)
[noreply] convert windmill min timestamp to beam min timestamp (#21915)
[noreply] [CdapIO] Fixed necessary warnings (#22399)
[noreply] [#22051]: Add read_time support to Google Cloud Datastore connector
------------------------------------------
[...truncated 33.70 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/27 08:43:25 Using specified worker binary: 'linux_amd64/combine'
2022/07/27 08:43:26 Prepared job with id: load-tests-go-flink-batch-combine-1-0727065313_6779b2cb-6c2b-4b51-81c3-68de7541f558 and staging token: load-tests-go-flink-batch-combine-1-0727065313_6779b2cb-6c2b-4b51-81c3-68de7541f558
2022/07/27 08:43:30 Staged binary artifact with token:
2022/07/27 08:43:31 Submitted job: load0tests0go0flink0batch0combine0100727065313-root-0727084330-a8d49ed6_ab07706e-d37c-497a-8406-49e8d5b25472
2022/07/27 08:43:31 Job state: STOPPED
2022/07/27 08:43:31 Job state: STARTING
2022/07/27 08:43:31 Job state: RUNNING
2022/07/27 08:44:41 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/27 08:44:41 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/27 08:44:41 Job state: FAILED
2022/07/27 08:44:41 Failed to execute job: job load0tests0go0flink0batch0combine0100727065313-root-0727084330-a8d49ed6_ab07706e-d37c-497a-8406-49e8d5b25472 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100727065313-root-0727084330-a8d49ed6_ab07706e-d37c-497a-8406-49e8d5b25472 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2668, 0xc0001a6000}, {0x1430a47?, 0x1f14ef0?}, {0xc0005c9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4nr6a774nwpia
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #600
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/600/display/redirect?page=changes>
Changes:
[Steve Niemitz] Fix overly aggressive null check in RowWriterFactory
[noreply] Bump cloud.google.com/go/bigquery from 1.35.0 to 1.36.0 in /sdks
[noreply] Disallow EventTimes in iterators (#22435)
[noreply] Update the upper bound for google-cloud-recommendations-ai. (#22398)
[noreply] LoadTestsBuilder: Disallow whitespace in option values (#22437)
------------------------------------------
[...truncated 33.74 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/26 08:43:51 Using specified worker binary: 'linux_amd64/combine'
2022/07/26 08:43:52 Prepared job with id: load-tests-go-flink-batch-combine-1-0726065314_e890627a-c60a-48d4-82c7-da876a85281b and staging token: load-tests-go-flink-batch-combine-1-0726065314_e890627a-c60a-48d4-82c7-da876a85281b
2022/07/26 08:43:56 Staged binary artifact with token:
2022/07/26 08:43:57 Submitted job: load0tests0go0flink0batch0combine0100726065314-root-0726084356-d7bbdb95_2ec08680-6972-4aa6-ab32-c8c9e4d54ee1
2022/07/26 08:43:57 Job state: STOPPED
2022/07/26 08:43:57 Job state: STARTING
2022/07/26 08:43:57 Job state: RUNNING
2022/07/26 08:45:06 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/26 08:45:06 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/26 08:45:06 Job state: FAILED
2022/07/26 08:45:06 Failed to execute job: job load0tests0go0flink0batch0combine0100726065314-root-0726084356-d7bbdb95_2ec08680-6972-4aa6-ab32-c8c9e4d54ee1 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100726065314-root-0726084356-d7bbdb95_2ec08680-6972-4aa6-ab32-c8c9e4d54ee1 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15c2668, 0xc00004a0c0}, {0x1430a47?, 0x1f14ef0?}, {0xc000027e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 49s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/rszv34xutbp3c
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
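The root cause repeated at the bottom of the trace above is a Jackson deserialization failure: the Flink REST client parses the `JobDetailsInfo` response with `FAIL_ON_NULL_FOR_PRIMITIVES` enabled, so a JSON `null` for the primitive `long` field `maxParallelism` (typically a symptom of a REST API mismatch between the submitting client and the cluster) aborts parsing instead of being coerced to `0`. The behavior can be reproduced in isolation with a minimal sketch; the `Details` POJO below is a hypothetical stand-in for Flink's `JobDetailsInfo`, and Jackson 2.x is assumed on the classpath.

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullPrimitiveDemo {
    // Hypothetical stand-in for JobDetailsInfo: a primitive long field
    // that cannot represent a JSON null.
    public static class Details {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // Strict mapper (mirrors Flink's REST client configuration):
        // null into a primitive long is a hard error.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, Details.class);
        } catch (MismatchedInputException e) {
            System.out.println("strict mapper rejects null for primitive long");
        }

        // Lenient mapper (Jackson's default): null is coerced to the
        // primitive default value 0.
        ObjectMapper lenient = new ObjectMapper();
        Details d = lenient.readValue(json, Details.class);
        System.out.println("lenient maxParallelism = " + d.maxParallelism); // prints 0
    }
}
```

With the feature disabled (Jackson's out-of-the-box default), the field silently becomes `0`; Flink enables it deliberately so a malformed or mismatched REST response fails loudly rather than reporting a bogus `maxParallelism`.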
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #599
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/599/display/redirect>
Changes:
------------------------------------------
[...truncated 33.55 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/25 08:43:08 Using specified worker binary: 'linux_amd64/combine'
2022/07/25 08:43:09 Prepared job with id: load-tests-go-flink-batch-combine-1-0725065308_f7ef385c-c7c1-4265-bbb3-53929fdd8636 and staging token: load-tests-go-flink-batch-combine-1-0725065308_f7ef385c-c7c1-4265-bbb3-53929fdd8636
2022/07/25 08:43:13 Staged binary artifact with token:
2022/07/25 08:43:14 Submitted job: load0tests0go0flink0batch0combine0100725065308-root-0725084313-ba3de35e_b0fa8ee6-0a87-436d-b54c-3ef5f37a56ae
2022/07/25 08:43:15 Job state: STOPPED
2022/07/25 08:43:15 Job state: STARTING
2022/07/25 08:43:15 Job state: RUNNING
2022/07/25 08:44:23 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/25 08:44:23 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/25 08:44:23 Job state: FAILED
2022/07/25 08:44:23 Failed to execute job: job load0tests0go0flink0batch0combine0100725065308-root-0725084313-ba3de35e_b0fa8ee6-0a87-436d-b54c-3ef5f37a56ae failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100725065308-root-0725084313-ba3de35e_b0fa8ee6-0a87-436d-b54c-3ef5f37a56ae failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164fae8, 0xc0001a8000}, {0x14ba0a7?, 0x1ff1ed0?}, {0xc000153e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 32s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/q2u3o6uvduwde
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #598
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/598/display/redirect>
Changes:
------------------------------------------
[...truncated 33.73 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/24 08:43:31 Using specified worker binary: 'linux_amd64/combine'
2022/07/24 08:43:32 Prepared job with id: load-tests-go-flink-batch-combine-1-0724065314_c6c874bb-1084-4992-a026-22a9b8e32f37 and staging token: load-tests-go-flink-batch-combine-1-0724065314_c6c874bb-1084-4992-a026-22a9b8e32f37
2022/07/24 08:43:36 Staged binary artifact with token:
2022/07/24 08:43:37 Submitted job: load0tests0go0flink0batch0combine0100724065314-root-0724084336-95b09732_eff40b24-91ee-4070-8321-4d93ac5d0764
2022/07/24 08:43:37 Job state: STOPPED
2022/07/24 08:43:37 Job state: STARTING
2022/07/24 08:43:37 Job state: RUNNING
2022/07/24 08:44:46 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/24 08:44:46 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/24 08:44:47 Job state: FAILED
2022/07/24 08:44:47 Failed to execute job: job load0tests0go0flink0batch0combine0100724065314-root-0724084336-95b09732_eff40b24-91ee-4070-8321-4d93ac5d0764 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100724065314-root-0724084336-95b09732_eff40b24-91ee-4070-8321-4d93ac5d0764 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164fae8, 0xc00012e000}, {0x14ba0a7?, 0x1ff1ed0?}, {0xc0003e3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 32s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/2uxjuvaxfn7vg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
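
[Editor's note] The root cause in the trace above is Flink's REST client failing to map a `null` "maxParallelism" into the primitive `long` field of `JobDetailsInfo`, because its ObjectMapper enables `FAIL_ON_NULL_FOR_PRIMITIVES`. The exception message itself points at that feature. The following is a minimal standalone sketch of the behavior using plain (unshaded) jackson-databind; the `Details` class is a hypothetical stand-in for `JobDetailsInfo`, not Flink's actual class:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullPrimitiveDemo {
    // Hypothetical stand-in for JobDetailsInfo's primitive long field.
    static class Details {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // With FAIL_ON_NULL_FOR_PRIMITIVES enabled (as in the failing trace),
        // mapping null into a primitive long throws MismatchedInputException.
        ObjectMapper strict = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, true);
        try {
            strict.readValue(json, Details.class);
        } catch (MismatchedInputException e) {
            System.out.println("strict mapper failed: " + e.getClass().getSimpleName());
        }

        // With the feature disabled, as the error message suggests,
        // Jackson coerces the null to the primitive default (0).
        ObjectMapper lenient = new ObjectMapper()
            .configure(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, false);
        Details d = lenient.readValue(json, Details.class);
        System.out.println("lenient maxParallelism = " + d.maxParallelism);
    }
}
```

Note this only reproduces the Jackson symptom; the actual fix in this scenario would be on the server side (the cluster returning a JobDetailsInfo payload with a null field, typically a client/cluster version mismatch), not relaxing the client mapper.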
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #597
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/597/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] change getting window width method
[Moritz Mack] Closes #22407: Separate sources for SparkStructuredStreamingRunner for
[Moritz Mack] Add deprecation warning for Spark 2 in SparkStructuredStreamingRunner
[noreply] Bump cloud.google.com/go/storage from 1.23.0 to 1.24.0 in /sdks (#22377)
[Pablo Estrada] Removing experimental annotation from JdbcIO
[noreply] Drop timeseries:postCommit dependency (#22414)
[noreply] Deduplicate identical environments in a pipeline. (#22308)
[noreply] Skip failing torch post commit test (#22418)
[noreply] Log level fix on local runner (#22420)
[noreply] Update element_type inference (default_type_hints) for batched DoFns
[noreply] Remove spaces in experiments (#22423)
------------------------------------------
[...truncated 33.65 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/23 08:43:39 Using specified worker binary: 'linux_amd64/combine'
2022/07/23 08:43:39 Prepared job with id: load-tests-go-flink-batch-combine-1-0723065307_29817c5e-4030-4c44-b804-096643273485 and staging token: load-tests-go-flink-batch-combine-1-0723065307_29817c5e-4030-4c44-b804-096643273485
2022/07/23 08:43:43 Staged binary artifact with token:
2022/07/23 08:43:44 Submitted job: load0tests0go0flink0batch0combine0100723065307-root-0723084343-2bd8a2f6_759ca93f-6fa3-4075-bbc9-a536ac5eaadb
2022/07/23 08:43:44 Job state: STOPPED
2022/07/23 08:43:44 Job state: STARTING
2022/07/23 08:43:44 Job state: RUNNING
2022/07/23 08:44:53 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/23 08:44:53 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/23 08:44:54 Job state: FAILED
2022/07/23 08:44:54 Failed to execute job: job load0tests0go0flink0batch0combine0100723065307-root-0723084343-2bd8a2f6_759ca93f-6fa3-4075-bbc9-a536ac5eaadb failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100723065307-root-0723084343-2bd8a2f6_759ca93f-6fa3-4075-bbc9-a536ac5eaadb failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164fae8, 0xc00012e000}, {0x14ba0a7?, 0x1ff1ed0?}, {0xc00043fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/u4f7bhuzknwto
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #596
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/596/display/redirect?page=changes>
Changes:
[balazs.nemeth] BEAM-14525 Fix for Protobuf getter/setter method name discovery issue
[balazs.nemeth] BEAM-14525 Added a proto message with the problematic properties to use
[balazs.nemeth] PR CR: updating issue links
[noreply] added olehborysevych as collaborator (#22391)
[noreply] Add accept-language header for MPL license (#22395)
[noreply] Bump terser from 5.9.0 to 5.14.2 in
[noreply] Fixes #22156: Fix Spark3 runner to compile against Spark 3.2/3.3 and add
------------------------------------------
[...truncated 33.63 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/22 08:43:17 Using specified worker binary: 'linux_amd64/combine'
2022/07/22 08:43:17 Prepared job with id: load-tests-go-flink-batch-combine-1-0722065315_83765ea9-b2b3-4aec-9dbc-d311405ac77c and staging token: load-tests-go-flink-batch-combine-1-0722065315_83765ea9-b2b3-4aec-9dbc-d311405ac77c
2022/07/22 08:43:21 Staged binary artifact with token:
2022/07/22 08:43:22 Submitted job: load0tests0go0flink0batch0combine0100722065315-root-0722084322-d19b290b_9df34944-a29f-433b-9462-19dd246a9acc
2022/07/22 08:43:22 Job state: STOPPED
2022/07/22 08:43:22 Job state: STARTING
2022/07/22 08:43:22 Job state: RUNNING
2022/07/22 08:44:31 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/22 08:44:31 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/22 08:44:31 Job state: FAILED
2022/07/22 08:44:31 Failed to execute job: job load0tests0go0flink0batch0combine0100722065315-root-0722084322-d19b290b_9df34944-a29f-433b-9462-19dd246a9acc failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100722065315-root-0722084322-d19b290b_9df34944-a29f-433b-9462-19dd246a9acc failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc00004a0c0}, {0x14bc54e?, 0x1ff8bb8?}, {0xc000235e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/yqmytgyk4pqg6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
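
[Editorial note] All three failures below share the same root cause line: Jackson cannot map a JSON `null` into the primitive `long` field `JobDetailsInfo["maxParallelism"]` when the Flink REST client parses the `/jobs/<id>` response. One plausible cause (an assumption, not confirmed by these logs) is a client/cluster version mismatch, where the session cluster returns a `JobDetailsInfo` payload without a usable `maxParallelism`. The strict-vs-lenient semantics of Jackson's `FAIL_ON_NULL_FOR_PRIMITIVES` can be sketched in plain Python; the helper below is illustrative only and is not Beam, Flink, or Jackson code:

```python
import json

def deserialize(payload: str, primitive_fields: set, fail_on_null: bool = True) -> dict:
    """Parse JSON, enforcing that primitive fields are non-null.

    With fail_on_null=True this mimics Jackson's default
    FAIL_ON_NULL_FOR_PRIMITIVES behavior (raise on null); with
    False, nulls are coerced to 0, as Jackson does when the
    feature is disabled.
    """
    data = json.loads(payload)
    for field in primitive_fields:
        if data.get(field) is None:
            if fail_on_null:
                raise ValueError(
                    "Cannot map `null` into a primitive: %r" % field)
            data[field] = 0  # lenient numeric default
    return data

# A response whose maxParallelism is null, as in the traces below:
payload = '{"name": "combine", "maxParallelism": null}'

try:
    deserialize(payload, {"maxParallelism"})
except ValueError as e:
    print("strict:", e)

print("lenient:", deserialize(payload, {"maxParallelism"}, fail_on_null=False))
```

If a version mismatch is indeed the cause, aligning the Flink client version used by the Beam job server with the session cluster would be a cleaner fix than relaxing `FAIL_ON_NULL_FOR_PRIMITIVES`.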
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #595
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/595/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Support combiner lifting.
[noreply] Bump google.golang.org/api from 0.87.0 to 0.88.0 in /sdks (#22350)
[Robert Bradshaw] More clarification.
[noreply] [CdapIO] HasOffset interface was implemented (#22193)
------------------------------------------
[...truncated 33.59 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/21 08:43:29 Using specified **** binary: 'linux_amd64/combine'
2022/07/21 08:43:30 Prepared job with id: load-tests-go-flink-batch-combine-1-0721065312_be6c4b2a-840f-41e5-af6d-4ff3e27abb74 and staging token: load-tests-go-flink-batch-combine-1-0721065312_be6c4b2a-840f-41e5-af6d-4ff3e27abb74
2022/07/21 08:43:34 Staged binary artifact with token:
2022/07/21 08:43:35 Submitted job: load0tests0go0flink0batch0combine0100721065312-root-0721084334-cd0325cd_94f169f5-706a-4742-96f3-f33f0d736fb0
2022/07/21 08:43:35 Job state: STOPPED
2022/07/21 08:43:35 Job state: STARTING
2022/07/21 08:43:35 Job state: RUNNING
2022/07/21 08:44:45 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/21 08:44:45 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/21 08:44:45 Job state: FAILED
2022/07/21 08:44:45 Failed to execute job: job load0tests0go0flink0batch0combine0100721065312-root-0721084334-cd0325cd_94f169f5-706a-4742-96f3-f33f0d736fb0 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100721065312-root-0721084334-cd0325cd_94f169f5-706a-4742-96f3-f33f0d736fb0 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc00012e000}, {0x14bc54e?, 0x1ff8bb8?}, {0xc00056be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 27s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/b267rf4fvh5qm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #594
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/594/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Require unique names for stages.
[noreply] Add links to the new RunInference content to Learning Resources (#22325)
[noreply] Unskip RunInference IT tests (#22324)
[noreply] cleaned up types in standard_coders.ts (#22316)
[noreply] JMH module for sdks:java:core with benchmarks for
[noreply] Bump cloud.google.com/go/pubsub from 1.23.1 to 1.24.0 in /sdks (#22332)
[Luke Cwik] [#22181] Fix java package for SDK java core benchmark
[Luke Cwik] Allow jmhTest to run concurrently with other jmhTest instances
[noreply] [BEAM-13015, #21250] Optimize encoding to a ByteString (#22345)
------------------------------------------
[...truncated 33.59 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/20 08:43:30 Using specified **** binary: 'linux_amd64/combine'
2022/07/20 08:43:30 Prepared job with id: load-tests-go-flink-batch-combine-1-0720065313_f5f107b0-5025-4344-9531-34af90836d80 and staging token: load-tests-go-flink-batch-combine-1-0720065313_f5f107b0-5025-4344-9531-34af90836d80
2022/07/20 08:43:34 Staged binary artifact with token:
2022/07/20 08:43:35 Submitted job: load0tests0go0flink0batch0combine0100720065313-root-0720084334-52c24d56_8e12ab78-8137-4336-ac54-cd2bb6150a06
2022/07/20 08:43:35 Job state: STOPPED
2022/07/20 08:43:35 Job state: STARTING
2022/07/20 08:43:35 Job state: RUNNING
2022/07/20 08:44:44 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/20 08:44:44 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
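The failure above bottoms out in Jackson refusing to bind a JSON `null` (or missing value) into the primitive `long` field `JobDetailsInfo.maxParallelism` — the log's own hint ("set ... FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow") shows Flink's REST client runs with that strict mode enabled, so a client/cluster mismatch in the REST payload aborts job-status polling until retries are exhausted. The strict-vs-lenient behavior can be mimicked in plain Python as an analogy only; `parse_max_parallelism` and its flag are hypothetical stand-ins, not Flink or Jackson code:

```python
import json

def parse_max_parallelism(payload: str, fail_on_null: bool = True) -> int:
    """Mimic Jackson binding a JSON field into a primitive (non-nullable) long.

    fail_on_null=True mirrors FAIL_ON_NULL_FOR_PRIMITIVES being enabled,
    which is the configuration the Flink REST client uses per the log above.
    """
    value = json.loads(payload).get("maxParallelism")
    if value is None:  # field absent or explicitly null
        if fail_on_null:
            # Jackson raises MismatchedInputException at this point.
            raise ValueError("Cannot map `null` into type `long`")
        return 0  # lenient mode coerces null to the primitive default
    return int(value)
```

With `fail_on_null=False` (Jackson's out-of-the-box default for this feature) the null would simply coerce to `0`; because Flink opts into the strict mode, the absent field is fatal and surfaces as the `RestClientException`/`RetryException` chain seen in the trace.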
2022/07/20 08:44:44 Job state: FAILED
2022/07/20 08:44:44 Failed to execute job: job load0tests0go0flink0batch0combine0100720065313-root-0720084334-52c24d56_8e12ab78-8137-4336-ac54-cd2bb6150a06 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100720065313-root-0720084334-52c24d56_8e12ab78-8137-4336-ac54-cd2bb6150a06 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc0001a8000}, {0x14bc54e?, 0x1ff8bb8?}, {0xc00030fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/xikm3m6bbta2w
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #593
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/593/display/redirect?page=changes>
Changes:
[noreply] [BEAM-14117] Unvendor bytebuddy dependency (#17317)
[noreply] Use npm ci instead of install in CI (#22323)
[noreply] Fix typo in use_single_core_per_container logic. (#22318)
[noreply] [#22319] Regenerate proto2_coder_test_messages_pb2.py manually (#22320)
------------------------------------------
[...truncated 33.62 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/19 08:43:20 Using specified worker binary: 'linux_amd64/combine'
2022/07/19 08:43:20 Prepared job with id: load-tests-go-flink-batch-combine-1-0719065313_400bcf94-0722-428e-80ca-0bc2bd027c21 and staging token: load-tests-go-flink-batch-combine-1-0719065313_400bcf94-0722-428e-80ca-0bc2bd027c21
2022/07/19 08:43:24 Staged binary artifact with token:
2022/07/19 08:43:25 Submitted job: load0tests0go0flink0batch0combine0100719065313-root-0719084324-4ddd2b14_14122768-b5cf-4c98-95a0-2263f1006aa5
2022/07/19 08:43:25 Job state: STOPPED
2022/07/19 08:43:25 Job state: STARTING
2022/07/19 08:43:25 Job state: RUNNING
2022/07/19 08:44:34 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/19 08:44:34 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/19 08:44:35 Job state: FAILED
2022/07/19 08:44:35 Failed to execute job: job load0tests0go0flink0batch0combine0100719065313-root-0719084324-4ddd2b14_14122768-b5cf-4c98-95a0-2263f1006aa5 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100719065313-root-0719084324-4ddd2b14_14122768-b5cf-4c98-95a0-2263f1006aa5 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc00012e000}, {0x14bc54e?, 0x1ff8bb8?}, {0xc000311e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 34s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/geeldomsscjr6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #592
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/592/display/redirect?page=changes>
Changes:
[Alexey Romanenko] [website] Add TPC-DS benchmark documentation
[noreply] Increase streaming server timeout (#22280)
------------------------------------------
[...truncated 33.64 KB...]
2022/07/18 08:43:22 Using specified worker binary: 'linux_amd64/combine'
2022/07/18 08:43:22 Prepared job with id: load-tests-go-flink-batch-combine-1-0718065322_f8c8e1df-8ab5-444c-9f26-174e0bae7eeb and staging token: load-tests-go-flink-batch-combine-1-0718065322_f8c8e1df-8ab5-444c-9f26-174e0bae7eeb
2022/07/18 08:43:27 Staged binary artifact with token:
2022/07/18 08:43:28 Submitted job: load0tests0go0flink0batch0combine0100718065322-root-0718084327-ebc4a1ed_3e92960e-3836-44b7-9de4-8c11987f0fd4
2022/07/18 08:43:28 Job state: STOPPED
2022/07/18 08:43:28 Job state: STARTING
2022/07/18 08:43:28 Job state: RUNNING
2022/07/18 08:44:37 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/18 08:44:37 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/18 08:44:37 Job state: FAILED
2022/07/18 08:44:37 Failed to execute job: job load0tests0go0flink0batch0combine0100718065322-root-0718084327-ebc4a1ed_3e92960e-3836-44b7-9de4-8c11987f0fd4 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100718065322-root-0718084327-ebc4a1ed_3e92960e-3836-44b7-9de4-8c11987f0fd4 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc00004a0c0}, {0x14bc54e?, 0x1ff8bb8?}, {0xc000441e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/dxwdfh7zfjxva
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
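
[Editorial note: a minimal, hypothetical sketch of the Jackson behavior behind the root-cause exception above. The `JobDetails` class here is a stand-in for Flink's `JobDetailsInfo`, not the real class: a JSON `null` cannot be mapped into a primitive `long` field while `FAIL_ON_NULL_FOR_PRIMITIVES` is enabled, which is why a REST response with `"maxParallelism": null` fails to parse.]

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullPrimitiveDemo {
    // Hypothetical stand-in for Flink's JobDetailsInfo: a primitive
    // field, so it cannot represent a JSON null.
    public static class JobDetails {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // With FAIL_ON_NULL_FOR_PRIMITIVES enabled (as Flink's REST
        // client effectively behaves here), null -> long is an error.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, JobDetails.class);
        } catch (MismatchedInputException e) {
            System.out.println("strict: " + e.getOriginalMessage());
        }

        // With the feature disabled (Jackson's default), null coerces
        // to the primitive default, 0L.
        JobDetails d = new ObjectMapper().readValue(json, JobDetails.class);
        System.out.println("lenient maxParallelism=" + d.maxParallelism);
    }
}
```

[A response in which the cluster omits or nulls `maxParallelism` (e.g. a client/cluster version mismatch) would trip the strict path above, matching the `MismatchedInputException` in the log.]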
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #591
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/591/display/redirect?page=changes>
Changes:
[vlad.matyunin] enabled multifile flag for multifile examples (PG)
[noreply] Merge pull request #22300 from Fixed [Playground] DeployExamples,
------------------------------------------
[...truncated 33.76 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/17 08:43:35 Using specified **** binary: 'linux_amd64/combine'
2022/07/17 08:43:35 Prepared job with id: load-tests-go-flink-batch-combine-1-0717065309_e38e49f5-ee2d-4fa3-be2e-2eea2c22105d and staging token: load-tests-go-flink-batch-combine-1-0717065309_e38e49f5-ee2d-4fa3-be2e-2eea2c22105d
2022/07/17 08:43:40 Staged binary artifact with token:
2022/07/17 08:43:41 Submitted job: load0tests0go0flink0batch0combine0100717065309-root-0717084340-28469755_7da3ca50-bb58-482e-ac6a-6d54df9bb4d5
2022/07/17 08:43:41 Job state: STOPPED
2022/07/17 08:43:41 Job state: STARTING
2022/07/17 08:43:41 Job state: RUNNING
2022/07/17 08:44:50 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/17 08:44:50 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/17 08:44:50 Job state: FAILED
2022/07/17 08:44:50 Failed to execute job: job load0tests0go0flink0batch0combine0100717065309-root-0717084340-28469755_7da3ca50-bb58-482e-ac6a-6d54df9bb4d5 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100717065309-root-0717084340-28469755_7da3ca50-bb58-482e-ac6a-6d54df9bb4d5 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc00004a0c0}, {0x14bc54e?, 0x1ff8bb8?}, {0xc0003b5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ru2bkjguqrltw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #590
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/590/display/redirect?page=changes>
Changes:
[vitaly.terentyev] [BEAM-14101] Add Spark Receiver IO package and ReceiverBuilder
[noreply] Bump protobufjs from 6.11.2 to 6.11.3 in /sdks/typescript
[egalpin] Moves timestamp skew override to correct place
[egalpin] Adds TestStream to verify window preservation of ElasticsearchIO#write
[egalpin] Removes unnecessary line
[egalpin] Adds validation that ES#Write outputs are in expected windows
[egalpin] Updates window verification test to assert the exact docs in the window
[egalpin] Uses guava Iterables over shaded avro version
[Robert Bradshaw] Don't try to parse non-flags as retained pipeline options.
[chamikaramj] Enables UnboundedSource wrapped SDF Kafka source by default for x-lang
[noreply] Merge pull request #22140 from [Playground Task] Sharing any code API
[bulat.safiullin] [Website] add playground section, update playground, update get-started
[noreply] RunInference documentation updates. (#22236)
[noreply] Turn pr bot on for remaining common labels (#22257)
[noreply] Reviewing the RunInference ReadMe file for clarity. (#22069)
[noreply] Collect heap profile on OOM on Dataflow (#22225)
[noreply] fixing the missing wrap around ring range read (#21786)
[noreply] Update RunInference documentation (#22250)
[noreply] Rewrote Java multi-language pipeline quickstart (#22263)
------------------------------------------
[...truncated 33.55 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/16 08:43:18 Using specified **** binary: 'linux_amd64/combine'
2022/07/16 08:43:18 Prepared job with id: load-tests-go-flink-batch-combine-1-0716065313_d321cd1f-57c0-4d5e-b456-7ad87976435f and staging token: load-tests-go-flink-batch-combine-1-0716065313_d321cd1f-57c0-4d5e-b456-7ad87976435f
2022/07/16 08:43:22 Staged binary artifact with token:
2022/07/16 08:43:23 Submitted job: load0tests0go0flink0batch0combine0100716065313-root-0716084322-c867566d_5e38d5ef-fffc-411d-a496-e55fc5215205
2022/07/16 08:43:23 Job state: STOPPED
2022/07/16 08:43:23 Job state: STARTING
2022/07/16 08:43:23 Job state: RUNNING
2022/07/16 08:44:32 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/16 08:44:32 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/16 08:44:32 Job state: FAILED
2022/07/16 08:44:32 Failed to execute job: job load0tests0go0flink0batch0combine0100716065313-root-0716084322-c867566d_5e38d5ef-fffc-411d-a496-e55fc5215205 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100716065313-root-0716084322-c867566d_5e38d5ef-fffc-411d-a496-e55fc5215205 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc00012e000}, {0x14bc54e?, 0x1ff8bb8?}, {0xc0005e9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/6soyr26zzp2e2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #589
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/589/display/redirect?page=changes>
Changes:
[Heejong Lee] [BEAM-22229] Override external SDK container URLs for Dataflow by
[danthev] Fix query retry in Java FirestoreIO.
[noreply] Split words on new lines or spaces (#22270)
[noreply] Replace \r\n, not just \n
[noreply] Pg auth test (#22277)
[noreply] [BEAM-14073] [CdapIO] CDAP IO for batch plugins: Read, Write. Unit tests
[Heejong Lee] update
[noreply] [Fix #22151] Add fhirio.Deidentify transform (#22152)
[noreply] Remove locks around ExecutionStateSampler (#22190)
------------------------------------------
[...truncated 33.61 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/15 08:43:29 Using specified worker binary: 'linux_amd64/combine'
2022/07/15 08:43:29 Prepared job with id: load-tests-go-flink-batch-combine-1-0715065320_cb93db5e-13f8-42e8-81a8-6162b527cc30 and staging token: load-tests-go-flink-batch-combine-1-0715065320_cb93db5e-13f8-42e8-81a8-6162b527cc30
2022/07/15 08:43:34 Staged binary artifact with token:
2022/07/15 08:43:35 Submitted job: load0tests0go0flink0batch0combine0100715065320-root-0715084334-451e1bb1_1ca49540-0cdb-481d-9884-81b83fdac483
2022/07/15 08:43:35 Job state: STOPPED
2022/07/15 08:43:35 Job state: STARTING
2022/07/15 08:43:35 Job state: RUNNING
2022/07/15 08:44:44 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/15 08:44:44 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/15 08:44:44 Job state: FAILED
2022/07/15 08:44:44 Failed to execute job: job load0tests0go0flink0batch0combine0100715065320-root-0715084334-451e1bb1_1ca49540-0cdb-481d-9884-81b83fdac483 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100715065320-root-0715084334-451e1bb1_1ca49540-0cdb-481d-9884-81b83fdac483 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651ae8, 0xc00012e000}, {0x14bc3e1?, 0x1ff7ad8?}, {0xc0005e3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/h2jg6idyrpj72
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #588
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/588/display/redirect?page=changes>
Changes:
[Heejong Lee] [BEAM-14506] Adding testcases and examples for xlang Python RunInference
[Heejong Lee] update
[Heejong Lee] update
[noreply] Move Go Primitives Integration Tests to Generic Registration (#22247)
[noreply] Move native Go examples to generic registration (#22245)
[noreply] Move youngoli to the reviewer exclusion list (#22195)
[noreply] Bump google.golang.org/api from 0.86.0 to 0.87.0 in /sdks (#22253)
[noreply] Bump cloud.google.com/go/bigquery from 1.34.1 to 1.35.0 in /sdks
[noreply] Bump google.golang.org/grpc from 1.47.0 to 1.48.0 in /sdks (#22252)
[noreply] Merge pull request #15786: Add gap-filling transform for timeseries
[chamikaramj] Adds an experiment that allows opting into using Kafka SDF-wrapper
[noreply] Defocus iframe on blur or mouseout (#22153) (#22154)
[noreply] Fix pydoc rendering for annotated classes (#22121)
[noreply] Fix typo in comment (#22266)
------------------------------------------
[...truncated 33.50 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/14 08:43:17 Using specified worker binary: 'linux_amd64/combine'
2022/07/14 08:43:17 Prepared job with id: load-tests-go-flink-batch-combine-1-0714065321_aaec5918-43a1-405e-900d-216003cfe3df and staging token: load-tests-go-flink-batch-combine-1-0714065321_aaec5918-43a1-405e-900d-216003cfe3df
2022/07/14 08:43:22 Staged binary artifact with token:
2022/07/14 08:43:24 Submitted job: load0tests0go0flink0batch0combine0100714065321-root-0714084323-96719a5b_59be939f-3edf-4489-a95f-faadf05c0ee0
2022/07/14 08:43:24 Job state: STOPPED
2022/07/14 08:43:24 Job state: STARTING
2022/07/14 08:43:24 Job state: RUNNING
2022/07/14 08:44:32 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/14 08:44:32 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/14 08:44:32 Job state: FAILED
2022/07/14 08:44:32 Failed to execute job: job load0tests0go0flink0batch0combine0100714065321-root-0714084323-96719a5b_59be939f-3edf-4489-a95f-faadf05c0ee0 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100714065321-root-0714084323-96719a5b_59be939f-3edf-4489-a95f-faadf05c0ee0 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651ae8, 0xc000136000}, {0x14bc3e1?, 0x1ff7ad8?}, {0xc000643e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vtjy5wx3drnpo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #587
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/587/display/redirect?page=changes>
Changes:
[naireenhussain] add new pubsub urn
[Pablo Estrada] Several requests to show experiments in Dataflow UI
[byronellis] Add org.pentaho to calcite relocated packages to fix vendoring
[noreply] Adding VladMatyunin as collaborator (#22239)
[noreply] Mark session runner as deprecated (#22242)
[noreply] Update google-cloud-core dependency to <3 (#22237)
[noreply] Move WC integration test to generic registration (#22248)
[noreply] Move Xlang Go examples to generic registration (#22249)
------------------------------------------
[...truncated 33.71 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
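Coders c4, c8, and c14 in the pipeline proto above wrap their component coders in beam:coder:length_prefix:v1, which frames the component's encoded bytes with their length as an unsigned varint. A minimal sketch of that framing (assuming the varint-length convention; this is not the SDK's actual coder code):

```go
package main

import (
	"bytes"
	"encoding/binary"
	"fmt"
)

// lengthPrefixEncode frames a component coder's output with its byte
// length as an unsigned varint, as beam:coder:length_prefix:v1 does.
func lengthPrefixEncode(encoded []byte) []byte {
	var buf bytes.Buffer
	var tmp [binary.MaxVarintLen64]byte
	n := binary.PutUvarint(tmp[:], uint64(len(encoded)))
	buf.Write(tmp[:n])
	buf.Write(encoded)
	return buf.Bytes()
}

// lengthPrefixDecode reads the varint length and returns the framed bytes.
func lengthPrefixDecode(data []byte) ([]byte, error) {
	length, n := binary.Uvarint(data)
	if n <= 0 || int(length) > len(data)-n {
		return nil, fmt.Errorf("malformed length prefix")
	}
	return data[n : n+int(length)], nil
}

func main() {
	framed := lengthPrefixEncode([]byte("top.accum"))
	out, _ := lengthPrefixDecode(framed)
	fmt.Printf("%s\n", out) // → top.accum
}
```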
2022/07/13 08:43:43 Using specified worker binary: 'linux_amd64/combine'
2022/07/13 08:43:43 Prepared job with id: load-tests-go-flink-batch-combine-1-0713065331_b778e9e1-517c-4b18-9e1f-1d121dddefe8 and staging token: load-tests-go-flink-batch-combine-1-0713065331_b778e9e1-517c-4b18-9e1f-1d121dddefe8
2022/07/13 08:43:48 Staged binary artifact with token:
2022/07/13 08:43:49 Submitted job: load0tests0go0flink0batch0combine0100713065331-root-0713084348-db7a98cf_804cba49-935e-4b64-93a1-0a2a09aa392a
2022/07/13 08:43:49 Job state: STOPPED
2022/07/13 08:43:49 Job state: STARTING
2022/07/13 08:43:49 Job state: RUNNING
2022/07/13 08:44:59 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/13 08:44:59 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/13 08:44:59 Job state: FAILED
2022/07/13 08:44:59 Failed to execute job: job load0tests0go0flink0batch0combine0100713065331-root-0713084348-db7a98cf_804cba49-935e-4b64-93a1-0a2a09aa392a failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100713065331-root-0713084348-db7a98cf_804cba49-935e-4b64-93a1-0a2a09aa392a failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00004a0c0}, {0x14ba11b?, 0x1ff4a58?}, {0xc00023de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ansqghy44xi6m
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #586
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/586/display/redirect?page=changes>
Changes:
[noreply] Split checkStyle from precommit into spotless job (#22203)
[noreply] Allow one to bound the size of output shards when writing to files.
[noreply] Bump moment from 2.29.2 to 2.29.4 in
[noreply] Allow BigQuery TableIds to have space in between (#22167)
[noreply] Use async as a suffix rather than a prefix for asynchronous variants.
[noreply] Override log levels after log handler is created (#22191)
[noreply] Remove deprecated unused option in seed job script (#22223)
[noreply] Better error for external BigQuery tables. (#22178)
[noreply] Try to fix playground workflow (#22226)
------------------------------------------
[...truncated 33.63 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
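The beam:coder:row:v1 payload above carries the synthetic source's schema: num_records, initial_splits, key_size, value_size, num_hot_keys, and hot_key_fraction. A hypothetical Go struct mirroring those fields (types inferred from the schema's type tags — integers plus one double; this is not the SDK's actual config type):

```go
package main

import "fmt"

// syntheticSourceConfig mirrors the row schema embedded in the
// beam:coder:row:v1 payload above. Field names come from the payload;
// types are inferred, not taken from the SDK source.
type syntheticSourceConfig struct {
	NumRecords     int64
	InitialSplits  int64
	KeySize        int64
	ValueSize      int64
	NumHotKeys     int64
	HotKeyFraction float64
}

func main() {
	// Illustrative values only; the load test supplies its own.
	cfg := syntheticSourceConfig{NumRecords: 200000, KeySize: 10, ValueSize: 90}
	fmt.Printf("%+v\n", cfg)
}
```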
2022/07/12 08:43:37 Using specified worker binary: 'linux_amd64/combine'
2022/07/12 08:43:37 Prepared job with id: load-tests-go-flink-batch-combine-1-0712065338_4f54aeb7-dd24-4e5c-95c5-6fcf272d3caa and staging token: load-tests-go-flink-batch-combine-1-0712065338_4f54aeb7-dd24-4e5c-95c5-6fcf272d3caa
2022/07/12 08:43:41 Staged binary artifact with token:
2022/07/12 08:43:42 Submitted job: load0tests0go0flink0batch0combine0100712065338-root-0712084341-6c50d204_6187f883-acb6-4a52-a904-46a45cb7a6a4
2022/07/12 08:43:42 Job state: STOPPED
2022/07/12 08:43:42 Job state: STARTING
2022/07/12 08:43:42 Job state: RUNNING
2022/07/12 08:44:51 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/12 08:44:51 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/12 08:44:51 Job state: FAILED
2022/07/12 08:44:51 Failed to execute job: job load0tests0go0flink0batch0combine0100712065338-root-0712084341-6c50d204_6187f883-acb6-4a52-a904-46a45cb7a6a4 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100712065338-root-0712084341-6c50d204_6187f883-acb6-4a52-a904-46a45cb7a6a4 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00012e000}, {0x14ba11b?, 0x1ff4a58?}, {0xc000365e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/oaurevuudvfrg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #585
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/585/display/redirect?page=changes>
Changes:
[noreply] [Website] Update minimum required Go version for sdk development
[noreply] Parallelizable DataFrame/Series mean (#22174)
------------------------------------------
[...truncated 33.57 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/11 08:43:18 Using specified **** binary: 'linux_amd64/combine'
2022/07/11 08:43:18 Prepared job with id: load-tests-go-flink-batch-combine-1-0706185312_a2945c32-4d27-42d2-a766-dc1f4aa91014 and staging token: load-tests-go-flink-batch-combine-1-0706185312_a2945c32-4d27-42d2-a766-dc1f4aa91014
2022/07/11 08:43:22 Staged binary artifact with token:
2022/07/11 08:43:24 Submitted job: load0tests0go0flink0batch0combine0100706185312-root-0711084323-3e769809_29fd02bc-1ffb-4af2-b89d-7ec4ced51051
2022/07/11 08:43:24 Job state: STOPPED
2022/07/11 08:43:24 Job state: STARTING
2022/07/11 08:43:24 Job state: RUNNING
2022/07/11 08:44:33 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/11 08:44:33 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/11 08:44:33 Job state: FAILED
2022/07/11 08:44:33 Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0711084323-3e769809_29fd02bc-1ffb-4af2-b89d-7ec4ced51051 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0711084323-3e769809_29fd02bc-1ffb-4af2-b89d-7ec4ced51051 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00012e000}, {0x14ba11b?, 0x1ff4a58?}, {0xc000661e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 33s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uigjdupkmwjsa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #584
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/584/display/redirect?page=changes>
Changes:
[noreply] Add typescript documentation to the programing guide. (#22137)
------------------------------------------
[...truncated 33.65 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/10 08:43:17 Using specified **** binary: 'linux_amd64/combine'
2022/07/10 08:43:17 Prepared job with id: load-tests-go-flink-batch-combine-1-0706185312_a5ad9d8d-f46a-4c99-b1a2-0c68fd2b5e24 and staging token: load-tests-go-flink-batch-combine-1-0706185312_a5ad9d8d-f46a-4c99-b1a2-0c68fd2b5e24
2022/07/10 08:43:24 Staged binary artifact with token:
2022/07/10 08:43:25 Submitted job: load0tests0go0flink0batch0combine0100706185312-root-0710084324-842f5776_12a78ba1-3b2b-4c60-926f-5b62d7667479
2022/07/10 08:43:25 Job state: STOPPED
2022/07/10 08:43:25 Job state: STARTING
2022/07/10 08:43:25 Job state: RUNNING
2022/07/10 08:44:34 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/10 08:44:34 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
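The root cause above is Jackson refusing to map a JSON `null` into the primitive `long` field `maxParallelism` because `DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES` is enabled in Flink's REST client. A minimal sketch of that Jackson behavior, assuming Jackson 2.x on the classpath; the `Details` class is a hypothetical stand-in for Flink's `JobDetailsInfo`:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullPrimitiveDemo {
    // Hypothetical stand-in for JobDetailsInfo: a primitive field
    // cannot represent a JSON null.
    static class Details {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // Jackson's default: null is coerced to 0 for primitives.
        ObjectMapper lenient = new ObjectMapper();
        System.out.println(lenient.readValue(json, Details.class).maxParallelism); // 0

        // With the feature enabled (as in the trace above), the same
        // input throws MismatchedInputException instead of coercing.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, Details.class);
        } catch (MismatchedInputException e) {
            System.out.println("rejected: " + e.getOriginalMessage());
        }
    }
}
```

In other words, the failure indicates the Flink REST endpoint returned a JobDetailsInfo payload with `maxParallelism: null`, which the strictly configured client cannot deserialize.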
2022/07/10 08:44:35 Job state: FAILED
2022/07/10 08:44:35 Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0710084324-842f5776_12a78ba1-3b2b-4c60-926f-5b62d7667479 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0710084324-842f5776_12a78ba1-3b2b-4c60-926f-5b62d7667479 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00004a0c0}, {0x14ba11b?, 0x1ff4a58?}, {0xc000175e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/qfr4zxeqr7x6m
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #583
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/583/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] change case studies link from staging to relative path
[bulat.safiullin] [Website] add I/O Connectors link to dropdown list, updating link to
[noreply] Update Go BPG xlang documentation to include Java automated service
[noreply] Merge pull request #22096 from [Playground] Infrastructure for sharing
[noreply] Support dependencies and remote registration in the typescript SDK.
[noreply] [BEAM-13015, #22050] Make SDK harness msec counters faster using ordered
[yathu] Fix build error due to dep confliction of google-cloud-bigquery-storage
[yathu] Fix atomicwrites old version purge on pypi
[noreply] Fix default type inference of CombinePerKey. (#16351)
------------------------------------------
[...truncated 33.62 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/09 08:43:26 Using specified worker binary: 'linux_amd64/combine'
2022/07/09 08:43:26 Prepared job with id: load-tests-go-flink-batch-combine-1-0706185312_c8f22d80-39f1-461c-807b-527f5a21f60a and staging token: load-tests-go-flink-batch-combine-1-0706185312_c8f22d80-39f1-461c-807b-527f5a21f60a
2022/07/09 08:43:31 Staged binary artifact with token:
2022/07/09 08:43:32 Submitted job: load0tests0go0flink0batch0combine0100706185312-root-0709084331-90096912_647204a2-c7bf-4fe1-a4f2-e7163bdd30e5
2022/07/09 08:43:32 Job state: STOPPED
2022/07/09 08:43:32 Job state: STARTING
2022/07/09 08:43:32 Job state: RUNNING
2022/07/09 08:44:41 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/09 08:44:41 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/09 08:44:41 Job state: FAILED
2022/07/09 08:44:41 Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0709084331-90096912_647204a2-c7bf-4fe1-a4f2-e7163bdd30e5 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0709084331-90096912_647204a2-c7bf-4fe1-a4f2-e7163bdd30e5 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc000134000}, {0x14ba11b?, 0x1ff4a58?}, {0xc0003afe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/pvk23xdds623g
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #582
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/582/display/redirect?page=changes>
Changes:
[bulat.safiullin] [Website] add refresh to page-nav.js
[relax] set timestamp when outputting finalize element
[alexey.inkin] Declarative theming, Remove duplicate PlaygroundState for embedded page,
[yathu] Fix Hadoop upload corrupted due to buffer reuse
[noreply] Propogate error messages from GcsUtil (#22079)
[noreply] Reenable Jenkins comment triggers (#22169)
[benjamin.gonzalez] Fix testKafkaIOReadsAndWritesCorrectlyInStreaming failing for kafka
[noreply] Add `schema_options` and `field_options` on RowTypeConstraint (#22133)
[noreply] Optimize locking in several critical-path methods (#22162)
[noreply] Deprecate AWS IOs (Java) using AWS SDK v1 in favor of IOs in
------------------------------------------
[...truncated 33.63 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/08 08:43:24 Using specified worker binary: 'linux_amd64/combine'
2022/07/08 08:43:25 Prepared job with id: load-tests-go-flink-batch-combine-1-0706185312_de1e16b5-4a64-4064-951d-6a09899c1ddd and staging token: load-tests-go-flink-batch-combine-1-0706185312_de1e16b5-4a64-4064-951d-6a09899c1ddd
2022/07/08 08:43:29 Staged binary artifact with token:
2022/07/08 08:43:30 Submitted job: load0tests0go0flink0batch0combine0100706185312-root-0708084329-a1e557f8_5740bd62-2968-4756-a99a-0655bc2a633c
2022/07/08 08:43:30 Job state: STOPPED
2022/07/08 08:43:30 Job state: STARTING
2022/07/08 08:43:30 Job state: RUNNING
2022/07/08 08:44:39 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/08 08:44:39 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/08 08:44:39 Job state: FAILED
2022/07/08 08:44:39 Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0708084329-a1e557f8_5740bd62-2968-4756-a99a-0655bc2a633c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0708084329-a1e557f8_5740bd62-2968-4756-a99a-0655bc2a633c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc000136000}, {0x14ba11b?, 0x1ff4a58?}, {0xc00072fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/mgsv7z4axuvpo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
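The recurring root cause in the builds above is Jackson refusing to map a JSON `null` for `maxParallelism` into a primitive `long` while deserializing `JobDetailsInfo`. As a hedged illustration (using plain jackson-databind rather than Flink's shaded copy, and a hypothetical `Details` class standing in for `JobDetailsInfo`), the `FAIL_ON_NULL_FOR_PRIMITIVES` feature named in the error message behaves like this:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

public class NullPrimitiveDemo {
    // Hypothetical stand-in for JobDetailsInfo: a primitive long field,
    // so a JSON null has no natural value to map to.
    public static class Details {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // Flink's REST client enables this feature, which is why the
        // builds above fail with MismatchedInputException.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, Details.class);
            System.out.println("strict: parsed");
        } catch (Exception e) {
            // Cannot map `null` into type `long`
            System.out.println("strict: " + e.getClass().getSimpleName());
        }

        // With the feature disabled, null falls back to the primitive
        // default (0L) instead of throwing.
        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        Details d = lenient.readValue(json, Details.class);
        System.out.println("lenient maxParallelism=" + d.maxParallelism);
    }
}
```

Note this only sketches the deserialization behavior; in the failing builds the strict mapper lives inside the Flink REST client, so the practical question is why the cluster's `/jobs/:jobid` response carried a null `maxParallelism` in the first place (e.g. a Flink version mismatch between client and cluster).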
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #581
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/581/display/redirect?page=changes>
Changes:
[noreply] [BEAM-11103] Add blog post for go 2.40 release (#17723)
[noreply] Fix test_row_coder_fail_early_bad_schema fails run after
[noreply] Tune ByteStringCoder allocations (#22144)
[noreply] Enable passing tests on dataflow runner v2. (#22136)
[noreply] Merge pull request #17727 from [BEAM-9482] Fix "provided port is already
[noreply] Fix date for go 2.40 blog post
[noreply] Fix month for 2.40 go blog post
[noreply] [BEAM-14545] Optimize copies in dataflow v1 shuffle reader. (#17802)
[noreply] Tune StreamingModeExecutionContext allocations (#22142)
[noreply] [BEAM-3221] Improve documentation around split request and response
[noreply] Fix documentation about hand implemented global aggregations (#22173)
[noreply] Merge pull request #21872 from Standardizing output of WriteToBigQuery
------------------------------------------
[...truncated 33.49 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/07 08:43:12 Using specified **** binary: 'linux_amd64/combine'
2022/07/07 08:43:12 Prepared job with id: load-tests-go-flink-batch-combine-1-0706185312_4d8da62f-04e8-49b6-91d1-9ab6a869725c and staging token: load-tests-go-flink-batch-combine-1-0706185312_4d8da62f-04e8-49b6-91d1-9ab6a869725c
2022/07/07 08:43:16 Staged binary artifact with token:
2022/07/07 08:43:17 Submitted job: load0tests0go0flink0batch0combine0100706185312-root-0707084316-4814cb50_2c39391d-b5d5-4f23-83c0-ede84ef54ed5
2022/07/07 08:43:17 Job state: STOPPED
2022/07/07 08:43:17 Job state: STARTING
2022/07/07 08:43:17 Job state: RUNNING
2022/07/07 08:44:27 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/07 08:44:27 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/07 08:44:27 Job state: FAILED
2022/07/07 08:44:27 Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0707084316-4814cb50_2c39391d-b5d5-4f23-83c0-ede84ef54ed5 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100706185312-root-0707084316-4814cb50_2c39391d-b5d5-4f23-83c0-ede84ef54ed5 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00012e000}, {0x14ba11b?, 0x1ff4a58?}, {0xc000177e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 26s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/njeshohusdk34
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #580
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/580/display/redirect>
Changes:
------------------------------------------
[...truncated 33.58 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/06 08:43:12 Using specified **** binary: 'linux_amd64/combine'
2022/07/06 08:43:12 Prepared job with id: load-tests-go-flink-batch-combine-1-0706065308_4eff9fa7-1f84-4af0-93c5-49e6901934bb and staging token: load-tests-go-flink-batch-combine-1-0706065308_4eff9fa7-1f84-4af0-93c5-49e6901934bb
2022/07/06 08:43:16 Staged binary artifact with token:
2022/07/06 08:43:17 Submitted job: load0tests0go0flink0batch0combine0100706065308-root-0706084316-b5a7b0e4_1f2c50b8-0c36-4653-b8e0-d98a3fa47d06
2022/07/06 08:43:17 Job state: STOPPED
2022/07/06 08:43:17 Job state: STARTING
2022/07/06 08:43:17 Job state: RUNNING
2022/07/06 08:44:26 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/06 08:44:26 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/06 08:44:26 Job state: FAILED
2022/07/06 08:44:26 Failed to execute job: job load0tests0go0flink0batch0combine0100706065308-root-0706084316-b5a7b0e4_1f2c50b8-0c36-4653-b8e0-d98a3fa47d06 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100706065308-root-0706084316-b5a7b0e4_1f2c50b8-0c36-4653-b8e0-d98a3fa47d06 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00012e000}, {0x14ba11b?, 0x1ff4a58?}, {0xc000315e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/77vgq3im2fkli
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #579
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/579/display/redirect>
Changes:
------------------------------------------
[...truncated 33.70 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/05 08:43:32 Using specified **** binary: 'linux_amd64/combine'
2022/07/05 08:43:32 Prepared job with id: load-tests-go-flink-batch-combine-1-0705065315_5ed82f66-723d-4533-b3ad-b58ef33b5e86 and staging token: load-tests-go-flink-batch-combine-1-0705065315_5ed82f66-723d-4533-b3ad-b58ef33b5e86
2022/07/05 08:43:36 Staged binary artifact with token:
2022/07/05 08:43:37 Submitted job: load0tests0go0flink0batch0combine0100705065315-root-0705084337-3f8bf8b4_ed5ddf38-6872-436f-9a19-39157a6d250f
2022/07/05 08:43:37 Job state: STOPPED
2022/07/05 08:43:37 Job state: STARTING
2022/07/05 08:43:37 Job state: RUNNING
2022/07/05 08:44:46 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/05 08:44:46 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/05 08:44:46 Job state: FAILED
2022/07/05 08:44:46 Failed to execute job: job load0tests0go0flink0batch0combine0100705065315-root-0705084337-3f8bf8b4_ed5ddf38-6872-436f-9a19-39157a6d250f failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100705065315-root-0705084337-3f8bf8b4_ed5ddf38-6872-436f-9a19-39157a6d250f failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00012e000}, {0x14ba11b?, 0x1ff4a58?}, {0xc0003a7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 31s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ufvyuoeyaeebk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #578
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/578/display/redirect?page=changes>
Changes:
[noreply] Go SDK: Update memfs to parse the List() pattern as a glob, not a regexp
[noreply] Bump cloud.google.com/go/pubsub from 1.23.0 to 1.23.1 in /sdks (#22122)
------------------------------------------
[...truncated 33.64 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/04 08:43:11 Using specified **** binary: 'linux_amd64/combine'
2022/07/04 08:43:11 Prepared job with id: load-tests-go-flink-batch-combine-1-0704065318_5287b8ed-72b5-4881-9329-249ad1533329 and staging token: load-tests-go-flink-batch-combine-1-0704065318_5287b8ed-72b5-4881-9329-249ad1533329
2022/07/04 08:43:15 Staged binary artifact with token:
2022/07/04 08:43:16 Submitted job: load0tests0go0flink0batch0combine0100704065318-root-0704084316-38b8c2a6_23cbafb4-9d5d-4690-b8ca-6616251d0fad
2022/07/04 08:43:16 Job state: STOPPED
2022/07/04 08:43:16 Job state: STARTING
2022/07/04 08:43:16 Job state: RUNNING
2022/07/04 08:44:25 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/04 08:44:25 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/04 08:44:25 Job state: FAILED
2022/07/04 08:44:25 Failed to execute job: job load0tests0go0flink0batch0combine0100704065318-root-0704084316-38b8c2a6_23cbafb4-9d5d-4690-b8ca-6616251d0fad failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100704065318-root-0704084316-38b8c2a6_23cbafb4-9d5d-4690-b8ca-6616251d0fad failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00012e000}, {0x14ba11b?, 0x1ff4a58?}, {0xc0000d1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/u7nslp3whw3ka
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #577
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/577/display/redirect?page=changes>
Changes:
[noreply] Sharding IO tests(amazon web services and amazon web services 2) from
------------------------------------------
[...truncated 33.77 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/03 08:43:41 Using specified worker binary: 'linux_amd64/combine'
2022/07/03 08:43:42 Prepared job with id: load-tests-go-flink-batch-combine-1-0703065314_2b65d7f0-f0b0-46a6-ba6c-3436815c5c35 and staging token: load-tests-go-flink-batch-combine-1-0703065314_2b65d7f0-f0b0-46a6-ba6c-3436815c5c35
2022/07/03 08:43:46 Staged binary artifact with token:
2022/07/03 08:43:47 Submitted job: load0tests0go0flink0batch0combine0100703065314-root-0703084346-2c0f7801_9971d011-4060-4995-b482-7748957d7e68
2022/07/03 08:43:47 Job state: STOPPED
2022/07/03 08:43:47 Job state: STARTING
2022/07/03 08:43:47 Job state: RUNNING
2022/07/03 08:44:56 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/03 08:44:56 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/03 08:44:56 Job state: FAILED
2022/07/03 08:44:56 Failed to execute job: job load0tests0go0flink0batch0combine0100703065314-root-0703084346-2c0f7801_9971d011-4060-4995-b482-7748957d7e68 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100703065314-root-0703084346-2c0f7801_9971d011-4060-4995-b482-7748957d7e68 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00012e000}, {0x14ba11b?, 0x1ff4a58?}, {0xc000651e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/amcjl4dn4t2fy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
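The root cause in the trace above is Jackson refusing to map a JSON `null` into the primitive `long` field `maxParallelism` of Flink's `JobDetailsInfo`, which suggests the cluster's REST response omits that field (e.g. a client/cluster version mismatch). A minimal sketch of this failure mode, assuming jackson-databind on the classpath; the `JobDetails` class here is illustrative, not Flink's actual REST type:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;

// Reproduces the failure mode seen above: deserializing a JSON `null`
// into a primitive `long` throws MismatchedInputException while
// FAIL_ON_NULL_FOR_PRIMITIVES is enabled; disabling it maps null to 0.
public class NullPrimitiveDemo {
    static class JobDetails {
        public long maxParallelism; // primitive: cannot hold null
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, JobDetails.class);
        } catch (Exception e) {
            System.out.println("strict: " + e.getClass().getSimpleName());
        }

        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        JobDetails d = lenient.readValue(json, JobDetails.class);
        System.out.println("lenient maxParallelism = " + d.maxParallelism);
    }
}
```

Note the Flink REST client in the trace runs with the strict setting, which is why a missing/null `maxParallelism` fails the whole job-details poll rather than defaulting to 0.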
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #576
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/576/display/redirect?page=changes>
Changes:
[Moritz Mack] Deprecate runner support for Spark 2.4 (closes #22094)
[noreply] Python: Use RowTypeConstraint for normalizing all schema-inferrable user
[noreply] changing nameBase value to Java_GCP_IO_Direct (#22128)
[noreply] Bump dataflow fnapi java sdk version (#22127)
------------------------------------------
[...truncated 33.58 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/02 08:43:21 Using specified worker binary: 'linux_amd64/combine'
2022/07/02 08:43:22 Prepared job with id: load-tests-go-flink-batch-combine-1-0702065312_356888fc-1671-4b84-b495-f154e93f7d7a and staging token: load-tests-go-flink-batch-combine-1-0702065312_356888fc-1671-4b84-b495-f154e93f7d7a
2022/07/02 08:43:26 Staged binary artifact with token:
2022/07/02 08:43:27 Submitted job: load0tests0go0flink0batch0combine0100702065312-root-0702084326-be104de2_633853c7-797b-429b-8c8a-49849c637882
2022/07/02 08:43:27 Job state: STOPPED
2022/07/02 08:43:27 Job state: STARTING
2022/07/02 08:43:27 Job state: RUNNING
2022/07/02 08:44:35 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/02 08:44:35 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/07/02 08:44:35 Job state: FAILED
2022/07/02 08:44:35 Failed to execute job: job load0tests0go0flink0batch0combine0100702065312-root-0702084326-be104de2_633853c7-797b-429b-8c8a-49849c637882 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100702065312-root-0702084326-be104de2_633853c7-797b-429b-8c8a-49849c637882 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc000198000}, {0x14ba11b?, 0x1ff4a58?}, {0xc0003d3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/qajdqbtyudu24
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #575
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/575/display/redirect?page=changes>
Changes:
[alexey.inkin] Do not re-create PlaygroundState (#21950)
[noreply] [BEAM-14187] Fix NPE at initializeForKeyedRead in IsmReaderImpl (#22111)
[noreply] Remove unused legacy dataflow translate code (#22019)
[noreply] Fixes #21698: Use normal Container snapshots for Go Load Tests (#22102)
[noreply] Change default, options, and explanation for issue priority (#22116)
[noreply] Minor: Bump flake8 to 4.0.1 (#22110)
[noreply] Add sdk_harness_log_level_overrides option for python sdk (#22077)
[noreply] Fix typo in Pytorch Bert Language Modeling (#22114)
[noreply] Fix #21977: Add Search transform to Go FhirIO (#21979)
------------------------------------------
[...truncated 33.58 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/01 08:43:20 Using specified worker binary: 'linux_amd64/combine'
2022/07/01 08:43:20 Prepared job with id: load-tests-go-flink-batch-combine-1-0701065313_7c5080c8-5a5d-42f5-92c5-1781cdbd8d18 and staging token: load-tests-go-flink-batch-combine-1-0701065313_7c5080c8-5a5d-42f5-92c5-1781cdbd8d18
2022/07/01 08:43:24 Staged binary artifact with token:
2022/07/01 08:43:25 Submitted job: load0tests0go0flink0batch0combine0100701065313-root-0701084324-af0d140b_e1f05b68-b97f-4d0f-837c-17d9e188f7c0
2022/07/01 08:43:25 Job state: STOPPED
2022/07/01 08:43:25 Job state: STARTING
2022/07/01 08:43:25 Job state: RUNNING
2022/07/01 08:44:34 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/07/01 08:44:34 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
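The Jackson failure above occurs because Flink's REST client maps `JobDetailsInfo.maxParallelism` to a primitive `long`, so a JobManager that returns `"maxParallelism": null` (for example, one older than the client library) cannot be deserialized. A minimal sketch of the defensive alternative, treating the field as optional — the `parse_job_details` helper is hypothetical, with field names mirroring the REST payload seen in the trace:

```python
import json

def parse_job_details(raw: str) -> dict:
    """Parse a Flink /jobs/<jobid> REST response, tolerating a null or
    missing maxParallelism instead of failing like a primitive mapping."""
    details = json.loads(raw)
    if details.get("maxParallelism") is None:
        details["maxParallelism"] = -1  # sentinel: server did not report it
    return details

# Response shaped like the one the failing client rejected.
resp = '{"jid": "abc123", "state": "RUNNING", "maxParallelism": null}'
details = parse_job_details(resp)
```

As the error text itself suggests, the strict client can also be relaxed via `DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES`, or the mismatch avoided by aligning client and JobManager versions.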
2022/07/01 08:44:34 Job state: FAILED
2022/07/01 08:44:34 Failed to execute job: job load0tests0go0flink0batch0combine0100701065313-root-0701084324-af0d140b_e1f05b68-b97f-4d0f-837c-17d9e188f7c0 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100701065313-root-0701084324-af0d140b_e1f05b68-b97f-4d0f-837c-17d9e188f7c0 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x164f4c8, 0xc00004a0c0}, {0x14ba11b?, 0x1ff4a58?}, {0xc000235e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 35s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/rmpdz622jx55e
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #574
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/574/display/redirect?page=changes>
Changes:
[Andrew Pilloud] Projection Pushdown optimizer on by default
[noreply] Test and fix FlatMap(<builtin>) issue (#22104)
[noreply] Fix InputStream on platform with 4 byte long (#22107)
------------------------------------------
[...truncated 34.47 KB...]
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/30 08:43:14 Using specified worker binary: 'linux_amd64/combine'
2022/06/30 08:43:14 Prepared job with id: load-tests-go-flink-batch-combine-1-0630065311_34241ccf-8448-471f-82f7-9aa3246e6b5f and staging token: load-tests-go-flink-batch-combine-1-0630065311_34241ccf-8448-471f-82f7-9aa3246e6b5f
2022/06/30 08:43:18 Staged binary artifact with token:
2022/06/30 08:43:19 Submitted job: load0tests0go0flink0batch0combine0100630065311-root-0630084318-30f2260c_a083938b-c7fa-49e0-857b-121aaed4a61b
2022/06/30 08:43:19 Job state: STOPPED
2022/06/30 08:43:19 Job state: STARTING
2022/06/30 08:43:19 Job state: RUNNING
2022/06/30 08:43:38 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 9241863f857c1927a5a41273a4c43337)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2119)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1657)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/30 08:43:38 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/30 08:43:39 Job state: FAILED
2022/06/30 08:43:39 Failed to execute job: job load0tests0go0flink0batch0combine0100630065311-root-0630084318-30f2260c_a083938b-c7fa-49e0-857b-121aaed4a61b failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100630065311-root-0630084318-30f2260c_a083938b-c7fa-49e0-857b-121aaed4a61b failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1655548, 0xc00012e000}, {0x14bf5f3?, 0x1ffca98?}, {0xc000373e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 44s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/andysnzckirpi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #573
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/573/display/redirect?page=changes>
Changes:
[alexey.inkin] Add an abstract layer for analytics, fix logging change of snippet, fix
[damondouglas] Implement PubsubSchemaTransformMessageToFactory
[bulat.safiullin] [Website] add scroll-spy to body in case-studies/baseof.html
[noreply] Update issue bot to javascript and add label management (#22067)
[noreply] Clean up issue management doc page
[noreply] [BEAM-13015, #21250, fixes #22053] Improve PCollectionConsumerRegistry
[noreply] sharding GCP IO tests from the javaPostCommit task (#21800)
[noreply] Bump cloud.google.com/go/storage from 1.22.1 to 1.23.0 in /sdks (#22038)
[noreply] Followup sharding javaPostCommit (#22081)
[noreply] remove mention of dill in release notes as it's not relevant. (#22087)
[noreply] [#21634] Add comments on FieldValueGetter. (#21982)
[noreply] Bump google.golang.org/api from 0.85.0 to 0.86.0 in /sdks (#22092)
[noreply] [BEAM-6597] Replace ProgressRequestCallback with BundleProgressReporter
[noreply] [Go SDK] Go Lint fixes (#21967)
[noreply] Fix #21869: Close GRPC connections on cancel (#21874)
[noreply] Add FlatMap(<builtin>) known issue to 2.40.0 blog (#22101)
[noreply] [BEAM-14347] Update docs to prefer generic registration functions
[noreply] Merge pull request #21752 from Feature/beam 13852 reimplement with
[noreply] Change wording of Pytorch LM example (#22099)
[noreply] Fix missing model_params in Pytorch docstring (#22100)
------------------------------------------
[...truncated 33.92 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n6gcr.io/apache-beam-testing/beam_sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/29 21:15:58 Using specified worker binary: 'linux_amd64/combine'
2022/06/29 21:15:58 Prepared job with id: load-tests-go-flink-batch-combine-1-0629210425_74988cbb-b137-4456-8deb-9f1224240fd5 and staging token: load-tests-go-flink-batch-combine-1-0629210425_74988cbb-b137-4456-8deb-9f1224240fd5
2022/06/29 21:16:04 Staged binary artifact with token:
2022/06/29 21:16:05 Submitted job: load0tests0go0flink0batch0combine0100629210425-root-0629211604-7f1cd4ee_a86a9ad8-5637-4d0d-b37e-d47037c35e42
2022/06/29 21:16:05 Job state: STOPPED
2022/06/29 21:16:05 Job state: STARTING
2022/06/29 21:16:05 Job state: RUNNING
2022/06/29 21:17:17 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
... 7 more
2022/06/29 21:17:17 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/06/29 21:17:17 Job state: FAILED
2022/06/29 21:17:17 Failed to execute job: job load0tests0go0flink0batch0combine0100629210425-root-0629211604-7f1cd4ee_a86a9ad8-5637-4d0d-b37e-d47037c35e42 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100629210425-root-0629211604-7f1cd4ee_a86a9ad8-5637-4d0d-b37e-d47037c35e42 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1655548, 0xc00012e000}, {0x14bf5f3?, 0x1ffca98?}, {0xc00039fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 53s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ei2kyzse3rk2w
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #572
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/572/display/redirect?page=changes>
Changes:
[Pablo Estrada] Blog post and updates for release 2.40.0
[noreply] 22011 remove checks on client.close() except when
[noreply] update flutter version to 3.0.1-stable (#22062)
[noreply] Add randomness to integration test job names to avoid collisions
[noreply] Give @pcoet triage permission (#22068)
[noreply] Issue#20877 Updated Interactive Beam README (#22034)
------------------------------------------
[...truncated 34.42 KB...]
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/28 08:43:14 Using specified worker binary: 'linux_amd64/combine'
2022/06/28 08:43:14 Prepared job with id: load-tests-go-flink-batch-combine-1-0628065310_f0f72f19-0055-4d70-b750-3931c652ab98 and staging token: load-tests-go-flink-batch-combine-1-0628065310_f0f72f19-0055-4d70-b750-3931c652ab98
2022/06/28 08:43:18 Staged binary artifact with token:
2022/06/28 08:43:19 Submitted job: load0tests0go0flink0batch0combine0100628065310-root-0628084318-d9ca63dc_9d8505c4-25b4-4374-9d4e-00e405a05a78
2022/06/28 08:43:19 Job state: STOPPED
2022/06/28 08:43:19 Job state: STARTING
2022/06/28 08:43:19 Job state: RUNNING
2022/06/28 08:43:37 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 08af1fd343ad8dab4921dbbca05fbe45)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/28 08:43:37 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
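The "Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions" error above is a secondary symptom: a JVM reports NoClassDefFoundError for a class whose static initialization already failed on an earlier attempt, so the original root cause (on Flink task managers, often a runner/SDK classpath or version mismatch) only appears in an earlier log on the same task manager. A minimal, self-contained sketch of this JVM behavior, not Beam code; the class names here are hypothetical:

```java
// Demonstrates JLS 12.4.2: the first use of a class whose static initializer
// throws raises ExceptionInInitializerError; the class is then marked
// erroneous, and every later use raises NoClassDefFoundError with the
// "Could not initialize class ..." message seen in the Flink log above.
public class InitErrorDemo {
    // Hypothetical stand-in for a class like SerializablePipelineOptions
    // whose static initialization fails on the worker.
    static class Broken {
        static final int VALUE = initialize(); // runs during class init

        static int initialize() {
            throw new RuntimeException("root cause, visible only once");
        }
    }

    // Touch the class and report which error type the JVM raised.
    static String tryUse() {
        try {
            return "ok: " + Broken.VALUE;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryUse()); // first use: ExceptionInInitializerError
        System.out.println(tryUse()); // later uses: NoClassDefFoundError
    }
}
```

Because only the first failure carries the real stack trace, the fix is usually found by locating the first ExceptionInInitializerError (or linkage error) in the task manager logs rather than in this repeated message.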
2022/06/28 08:43:37 Job state: FAILED
2022/06/28 08:43:37 Failed to execute job: job load0tests0go0flink0batch0combine0100628065310-root-0628084318-d9ca63dc_9d8505c4-25b4-4374-9d4e-00e405a05a78 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100628065310-root-0628084318-d9ca63dc_9d8505c4-25b4-4374-9d4e-00e405a05a78 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1641aa8, 0xc0000480c0}, {0x14addbc?, 0x1fe0a98?}, {0xc0006e5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 43s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4mbe4w55jtvkk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #571
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/571/display/redirect>
Changes:
------------------------------------------
[...truncated 34.36 KB...]
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/27 08:43:08 Using specified worker binary: 'linux_amd64/combine'
2022/06/27 08:43:09 Prepared job with id: load-tests-go-flink-batch-combine-1-0627065310_b3e53b81-4b92-4df0-bd5e-367d71196ef2 and staging token: load-tests-go-flink-batch-combine-1-0627065310_b3e53b81-4b92-4df0-bd5e-367d71196ef2
2022/06/27 08:43:13 Staged binary artifact with token:
2022/06/27 08:43:14 Submitted job: load0tests0go0flink0batch0combine0100627065310-root-0627084313-8dc83e1_44bcba8f-1e0c-4884-a505-fe5dccfde706
2022/06/27 08:43:14 Job state: STOPPED
2022/06/27 08:43:14 Job state: STARTING
2022/06/27 08:43:14 Job state: RUNNING
2022/06/27 08:43:34 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 7c62e82e879857d35f873d54f7a3d3d1)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/27 08:43:34 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/27 08:43:34 Job state: FAILED
2022/06/27 08:43:34 Failed to execute job: job load0tests0go0flink0batch0combine0100627065310-root-0627084313-8dc83e1_44bcba8f-1e0c-4884-a505-fe5dccfde706 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100627065310-root-0627084313-8dc83e1_44bcba8f-1e0c-4884-a505-fe5dccfde706 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1641aa8, 0xc0000480c0}, {0x14addbc?, 0x1fe0a98?}, {0xc000295e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 41s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wjee67lseergm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #570
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/570/display/redirect>
Changes:
------------------------------------------
[...truncated 34.41 KB...]
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/26 08:43:01 Using specified **** binary: 'linux_amd64/combine'
2022/06/26 08:43:02 Prepared job with id: load-tests-go-flink-batch-combine-1-0626065309_2052b037-c91d-437e-88f6-99693a8ae506 and staging token: load-tests-go-flink-batch-combine-1-0626065309_2052b037-c91d-437e-88f6-99693a8ae506
2022/06/26 08:43:05 Staged binary artifact with token:
2022/06/26 08:43:06 Submitted job: load0tests0go0flink0batch0combine0100626065309-root-0626084306-312c1efd_59779a45-6574-4f78-8eac-5f7e077068ed
2022/06/26 08:43:06 Job state: STOPPED
2022/06/26 08:43:06 Job state: STARTING
2022/06/26 08:43:06 Job state: RUNNING
2022/06/26 08:43:24 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 56c3cda124ad72c670ce20b0aa725836)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/26 08:43:24 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/26 08:43:24 Job state: FAILED
2022/06/26 08:43:24 Failed to execute job: job load0tests0go0flink0batch0combine0100626065309-root-0626084306-312c1efd_59779a45-6574-4f78-8eac-5f7e077068ed failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100626065309-root-0626084306-312c1efd_59779a45-6574-4f78-8eac-5f7e077068ed failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1641aa8, 0xc00012e000}, {0x14addbc?, 0x1fe0a98?}, {0xc0003a1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uqoxfbagjtxfk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #569
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/569/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Use WindowedValue.withValue rather than WindowedValue.of in
[Robert Bradshaw] [BEAM-14464] More efficient grouping keys in precombiner table.
[Robert Bradshaw] fix compile after merge
[Robert Bradshaw] spotless
[Robert Bradshaw] Only flush every Nth element.
[Robert Bradshaw] spotless
[Robert Bradshaw] Post-merge fix.
[Robert Bradshaw] Fix test expectations.
[bulat.safiullin] [Website] add guard expressions to fix-menu and page-nav
[noreply] Enable close issue as not planned (#22032)
[noreply] Rename README.md to ACTIONS.md (#22043)
[noreply] Removes examples of unscalable sinks from documentation. (#22020)
[noreply] Unify to a single issue report (#22045)
[noreply] Remove colon in issue report
[noreply] Bump cloud.google.com/go/pubsub from 1.22.2 to 1.23.0 in /sdks (#22036)
[noreply] Fix vendored dependency issue and other style checks (#22046)
[noreply] Bump shell-quote (#21983)
[noreply] Revert "[BEAM-13590]Update Pytest version to support Python 3.10
[noreply] Bump cloud.google.com/go/bigquery from 1.32.0 to 1.34.1 in /sdks
[noreply] Bump github.com/spf13/cobra from 1.4.0 to 1.5.0 in /sdks (#21955)
[yathu] checkStlye Fix: remove redundant static and public in interface. camel
[noreply] Fix DEADLINE_EXCEEDED flakiness (#22035)
[noreply] Fix SpannerIO flakes (#22023)
------------------------------------------
[...truncated 34.43 KB...]
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/25 08:43:26 Using specified worker binary: 'linux_amd64/combine'
2022/06/25 08:43:26 Prepared job with id: load-tests-go-flink-batch-combine-1-0625065314_f4bb5b79-f6f9-4ff1-be61-63914f8c07ce and staging token: load-tests-go-flink-batch-combine-1-0625065314_f4bb5b79-f6f9-4ff1-be61-63914f8c07ce
2022/06/25 08:43:30 Staged binary artifact with token:
2022/06/25 08:43:31 Submitted job: load0tests0go0flink0batch0combine0100625065314-root-0625084330-d004b82a_8fceac43-b4d7-4333-b071-492fb395a059
2022/06/25 08:43:31 Job state: STOPPED
2022/06/25 08:43:31 Job state: STARTING
2022/06/25 08:43:31 Job state: RUNNING
2022/06/25 08:43:49 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: a8038539916f54cfe4bbf6591d2fd706)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/25 08:43:49 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/25 08:43:49 Job state: FAILED
2022/06/25 08:43:49 Failed to execute job: job load0tests0go0flink0batch0combine0100625065314-root-0625084330-d004b82a_8fceac43-b4d7-4333-b071-492fb395a059 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100625065314-root-0625084330-d004b82a_8fceac43-b4d7-4333-b071-492fb395a059 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1641aa8, 0xc00012e000}, {0x14addbc?, 0x1fe0a98?}, {0xc0005f5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 51s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/55rdnt2hp2yeq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #568
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/568/display/redirect?page=changes>
Changes:
[noreply] Canonicalize standard_coders.yaml booleans
[andyye333] Move wrapper class outside run()
[noreply] Fix issues with test ordering (#21986)
[noreply] Followup fix FileIOTest.testMatchWatchForNewFiles flaky (#21877)
[noreply] Fix links for issue report (#22033)
[noreply] Merge pull request #21953 from Implement
------------------------------------------
[...truncated 34.43 KB...]
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/24 08:43:15 Using specified worker binary: 'linux_amd64/combine'
2022/06/24 08:43:15 Prepared job with id: load-tests-go-flink-batch-combine-1-0624065311_bb3b7f60-5b62-495a-ab1c-e3918663c038 and staging token: load-tests-go-flink-batch-combine-1-0624065311_bb3b7f60-5b62-495a-ab1c-e3918663c038
2022/06/24 08:43:19 Staged binary artifact with token:
2022/06/24 08:43:20 Submitted job: load0tests0go0flink0batch0combine0100624065311-root-0624084319-84d54d8d_87b607e8-5431-4812-91b8-f6b41904733d
2022/06/24 08:43:20 Job state: STOPPED
2022/06/24 08:43:20 Job state: STARTING
2022/06/24 08:43:20 Job state: RUNNING
2022/06/24 08:43:38 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 45c409487f3e556c00f11d1e63a7a50f)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/24 08:43:38 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/24 08:43:38 Job state: FAILED
2022/06/24 08:43:38 Failed to execute job: job load0tests0go0flink0batch0combine0100624065311-root-0624084319-84d54d8d_87b607e8-5431-4812-91b8-f6b41904733d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100624065311-root-0624084319-84d54d8d_87b607e8-5431-4812-91b8-f6b41904733d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1641aa8, 0xc0000480c0}, {0x14addbc?, 0x1fe0a98?}, {0xc00032de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 42s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/bxb2hmlo3bpiy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #567
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/567/display/redirect?page=changes>
Changes:
[Robert Bradshaw] Streaming-related runner fixes.
[Robert Bradshaw] Improvements to auto-started services.
[Robert Bradshaw] Fix version, asserts for remote execution.
[Robert Bradshaw] Add IO dependencies.
[Robert Bradshaw] Add several cross-language IOs.
[Robert Bradshaw] Disable tests that require new release is required for out-of-the-box
[rszper] Correcting the regex for the Dataflow job name.
[noreply] Bump cloud.google.com/go/datastore from 1.6.0 to 1.8.0 in /sdks (#21973)
[noreply] Bump google.golang.org/api from 0.83.0 to 0.85.0 in /sdks (#21974)
[noreply] [Go SDK] Adds a snippet for GBK in BPG (#21842)
[noreply] Update parameterized requirement in /sdks/python (#21975)
[noreply] Merge pull request #21981 from [Playground] Upgrade Flutter linter, fix
[noreply] Clean up redundant articles, prepositions, conjunctions appeared
[noreply] Fix FlatMap numpy array bug (#22006)
[Robert Bradshaw] More strongly typed outputs.
------------------------------------------
[...truncated 34.59 KB...]
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/23 08:43:50 Using specified worker binary: 'linux_amd64/combine'
2022/06/23 08:43:51 Prepared job with id: load-tests-go-flink-batch-combine-1-0623065313_9840731b-1bb6-4224-9bf7-4d1fc3d9d881 and staging token: load-tests-go-flink-batch-combine-1-0623065313_9840731b-1bb6-4224-9bf7-4d1fc3d9d881
2022/06/23 08:43:55 Staged binary artifact with token:
2022/06/23 08:43:56 Submitted job: load0tests0go0flink0batch0combine0100623065313-root-0623084355-33f64bbd_54400a4e-46c7-431f-b69b-969c15455ab9
2022/06/23 08:43:56 Job state: STOPPED
2022/06/23 08:43:56 Job state: STARTING
2022/06/23 08:43:56 Job state: RUNNING
2022/06/23 08:44:18 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 35e6718740569fcaaef147cb469294bf)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/23 08:44:18 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/06/23 08:44:18 Job state: FAILED
2022/06/23 08:44:18 Failed to execute job: job load0tests0go0flink0batch0combine0100623065313-root-0623084355-33f64bbd_54400a4e-46c7-431f-b69b-969c15455ab9 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100623065313-root-0623084355-33f64bbd_54400a4e-46c7-431f-b69b-969c15455ab9 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1641aa8, 0xc00012e000}, {0x14addbc?, 0x1fe0a98?}, {0xc000265e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 55s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ltqf4ssnhc2dw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #566
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/566/display/redirect?page=changes>
Changes:
[yiru] fix: Add a retry code to insertall retry policy
[johnjcasey] 21742 add warning for risky kafka configuration
[johnjcasey] 21742 run spotless
[noreply] [BEAM-13590]Update Pytest version to support Python 3.10 (#17791)
[noreply] Fix target email for flaky test/p0/p1 reports
[noreply] Add unit testing for graphx/user.go (#21962)
[bulat.safiullin] [Website] add lyft to quote cards on homepage, use relative paths for
[noreply] Update documentations and document generation (#21965)
[noreply] Add ExecuteBundles transform to Go FhirIO (#21840)
------------------------------------------
[...truncated 34.29 KB...]
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/22 08:43:02 Using specified worker binary: 'linux_amd64/combine'
2022/06/22 08:43:03 Prepared job with id: load-tests-go-flink-batch-combine-1-0622065312_d9e4730f-caa5-4dd7-9280-1b14a4818986 and staging token: load-tests-go-flink-batch-combine-1-0622065312_d9e4730f-caa5-4dd7-9280-1b14a4818986
2022/06/22 08:43:06 Staged binary artifact with token:
2022/06/22 08:43:07 Submitted job: load0tests0go0flink0batch0combine0100622065312-root-0622084307-55c8aab5_c1a28e80-9178-4d0f-a3d4-9784cabdb3f7
2022/06/22 08:43:07 Job state: STOPPED
2022/06/22 08:43:07 Job state: STARTING
2022/06/22 08:43:07 Job state: RUNNING
2022/06/22 08:43:30 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: feb530b46ae7b8d1a822c7edc88e6dfa)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/22 08:43:30 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/22 08:43:30 Job state: FAILED
2022/06/22 08:43:30 Failed to execute job: job load0tests0go0flink0batch0combine0100622065312-root-0622084307-55c8aab5_c1a28e80-9178-4d0f-a3d4-9784cabdb3f7 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100622065312-root-0622084307-55c8aab5_c1a28e80-9178-4d0f-a3d4-9784cabdb3f7 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c288, 0xc0000480c0}, {0x147edbc?, 0x1f95248?}, {0xc00026fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 37s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/yp44j74udh524
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #565
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/565/display/redirect?page=changes>
Changes:
[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Flink job
[noreply] Modified KafkaIO.Read SDF->Legacy forced override to fail if configured
------------------------------------------
[...truncated 34.43 KB...]
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/21 08:43:13 Using specified worker binary: 'linux_amd64/combine'
2022/06/21 08:43:13 Prepared job with id: load-tests-go-flink-batch-combine-1-0621065311_46b87ec3-b488-45e6-96b5-89c5d858b48c and staging token: load-tests-go-flink-batch-combine-1-0621065311_46b87ec3-b488-45e6-96b5-89c5d858b48c
2022/06/21 08:43:17 Staged binary artifact with token:
2022/06/21 08:43:18 Submitted job: load0tests0go0flink0batch0combine0100621065311-root-0621084317-ad545e3c_bfe7e7b1-ace9-4cff-bab0-d65494e4621e
2022/06/21 08:43:18 Job state: STOPPED
2022/06/21 08:43:18 Job state: STARTING
2022/06/21 08:43:18 Job state: RUNNING
2022/06/21 08:43:36 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 5326ba3b8b328c31edbd11650de31efd)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/21 08:43:36 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/21 08:43:36 Job state: FAILED
2022/06/21 08:43:36 Failed to execute job: job load0tests0go0flink0batch0combine0100621065311-root-0621084317-ad545e3c_bfe7e7b1-ace9-4cff-bab0-d65494e4621e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100621065311-root-0621084317-ad545e3c_bfe7e7b1-ace9-4cff-bab0-d65494e4621e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c288, 0xc00012e000}, {0x147edbc?, 0x1f95248?}, {0xc000657e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/h4by5yv3yydkc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #564
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/564/display/redirect>
Changes:
------------------------------------------
[...truncated 34.39 KB...]
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/20 08:43:11 Using specified worker binary: 'linux_amd64/combine'
2022/06/20 08:43:11 Prepared job with id: load-tests-go-flink-batch-combine-1-0620065310_2ead5548-e069-436b-92bb-a301bf0cf663 and staging token: load-tests-go-flink-batch-combine-1-0620065310_2ead5548-e069-436b-92bb-a301bf0cf663
2022/06/20 08:43:15 Staged binary artifact with token:
2022/06/20 08:43:16 Submitted job: load0tests0go0flink0batch0combine0100620065310-root-0620084315-374b4966_96e2909f-9760-43b3-83ce-af63ab366f76
2022/06/20 08:43:16 Job state: STOPPED
2022/06/20 08:43:16 Job state: STARTING
2022/06/20 08:43:16 Job state: RUNNING
2022/06/20 08:43:38 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 88539f30f52a713c6a93f84606e13908)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/20 08:43:38 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/20 08:43:38 Job state: FAILED
2022/06/20 08:43:38 Failed to execute job: job load0tests0go0flink0batch0combine0100620065310-root-0620084315-374b4966_96e2909f-9760-43b3-83ce-af63ab366f76 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100620065310-root-0620084315-374b4966_96e2909f-9760-43b3-83ce-af63ab366f76 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c288, 0xc0000480c0}, {0x147edbc?, 0x1f95248?}, {0xc000025e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 43s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jtmpimghpjtks
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #563
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/563/display/redirect?page=changes>
Changes:
[Pablo Estrada] Removing playground from main page to remove scrolling issue
[noreply] Merge pull request #21940 from [21941] Fix no output timestamp case
------------------------------------------
[...truncated 34.53 KB...]
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/19 08:43:33 Using specified worker binary: 'linux_amd64/combine'
2022/06/19 08:43:34 Prepared job with id: load-tests-go-flink-batch-combine-1-0619065338_1f421441-a66c-4cd5-b1af-e80515df6285 and staging token: load-tests-go-flink-batch-combine-1-0619065338_1f421441-a66c-4cd5-b1af-e80515df6285
2022/06/19 08:43:37 Staged binary artifact with token:
2022/06/19 08:43:38 Submitted job: load0tests0go0flink0batch0combine0100619065338-root-0619084338-b4c5f167_99ca29ff-fcb1-4c77-ba90-e17df6d97a73
2022/06/19 08:43:38 Job state: STOPPED
2022/06/19 08:43:38 Job state: STARTING
2022/06/19 08:43:38 Job state: RUNNING
2022/06/19 08:43:56 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ef215618648d9fb587b4d30ec2f7b22a)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/19 08:43:56 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/19 08:43:56 Job state: FAILED
2022/06/19 08:43:57 Failed to execute job: job load0tests0go0flink0batch0combine0100619065338-root-0619084338-b4c5f167_99ca29ff-fcb1-4c77-ba90-e17df6d97a73 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100619065338-root-0619084338-b4c5f167_99ca29ff-fcb1-4c77-ba90-e17df6d97a73 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c288, 0xc0000480c0}, {0x147edbc?, 0x1f95248?}, {0xc00026de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 43s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/rxrbs5lm6n25w
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #562
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/562/display/redirect?page=changes>
Changes:
[yathu] Unsickbay copy_rewrite_token tests
[yathu] [BEAM-3177][BEAM-5468] Add pipeline options to set default logging level
[ahmedabualsaud] test fixes
[ahmedabualsaud] no need for this line
[Kenneth Knowles] Suppress unneeded spotbugs unused store warnings
[Kenneth Knowles] Eliminate nullness errors in KafkaIO
[yathu] Fix beam_PostCommit_Java_Sickbay build
[noreply] Expand pr bot to python (#21791)
[noreply] Update run inference documentation (#21921)
[noreply] Consider skipped checks successful (#21924)
[bulat.safiullin] [Website] add publishdate attribute to frontmatter
[noreply] Add guidance on self-assigning/closing to issue templates (#21931)
[noreply] Update names.py
[noreply] [Website] add new case-study, fix styles, add related images (#21891)
[noreply] Merge pull request #21928 from [Fixes #21927] Compress
[noreply] BigQueryIO: Adding the BASIC view setting to getTable request (#21879)
------------------------------------------
[...truncated 34.50 KB...]
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/18 08:43:28 Using specified worker binary: 'linux_amd64/combine'
2022/06/18 08:43:29 Prepared job with id: load-tests-go-flink-batch-combine-1-0618065323_62e209b8-2814-4386-9eb8-c591fdcbb2de and staging token: load-tests-go-flink-batch-combine-1-0618065323_62e209b8-2814-4386-9eb8-c591fdcbb2de
2022/06/18 08:43:32 Staged binary artifact with token:
2022/06/18 08:43:33 Submitted job: load0tests0go0flink0batch0combine0100618065323-root-0618084333-91959d14_f9fcb901-480d-46f1-a5b9-13b266c90052
2022/06/18 08:43:33 Job state: STOPPED
2022/06/18 08:43:33 Job state: STARTING
2022/06/18 08:43:33 Job state: RUNNING
2022/06/18 08:43:54 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: f9e993a9b547843373eee62a4e4a9279)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/18 08:43:54 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/06/18 08:43:54 Job state: FAILED
2022/06/18 08:43:54 Failed to execute job: job load0tests0go0flink0batch0combine0100618065323-root-0618084333-91959d14_f9fcb901-480d-46f1-a5b9-13b266c90052 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100618065323-root-0618084333-91959d14_f9fcb901-480d-46f1-a5b9-13b266c90052 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c288, 0xc0000480c0}, {0x147edbc?, 0x1f95248?}, {0xc000625e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/5y4ss3g2boxqi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #561
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/561/display/redirect?page=changes>
Changes:
[egalpin] Drops usage of setWindowingStrategyInternal in favour of direct use of
[egalpin] Gives unique names to ES IO Write windowing
[Pablo Estrada] Update Python base image requirements
[Kenneth Knowles] Revert "convert windmill min timestamp to beam min timestamp"
[noreply] Fix a few small config issues (#21909)
[dannymccormick] Update py to python label
[noreply] Daily p0/p1/flaky reports for issues (#21725)
[noreply] Remove dataframe warnings from py38-docs logs (#21861)
[noreply] Update references to Jira to GH for the Java SDK (#21836)
[noreply] [21709] - Fix for "beam_PostCommit_Java_ValidatesRunner_Samza Failing"
[noreply] Update references to jira to GH for the Runners (#21835)
[noreply] Update remaining references to Jira to GH (#21834)
[Kenneth Knowles] Re-activate nullness checking for some of sdks/java/core/coders
------------------------------------------
[...truncated 34.45 KB...]
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/17 08:43:13 Using specified worker binary: 'linux_amd64/combine'
2022/06/17 08:43:13 Prepared job with id: load-tests-go-flink-batch-combine-1-0617065333_65817688-fe2c-417c-8e83-4289e5724860 and staging token: load-tests-go-flink-batch-combine-1-0617065333_65817688-fe2c-417c-8e83-4289e5724860
2022/06/17 08:43:17 Staged binary artifact with token:
2022/06/17 08:43:18 Submitted job: load0tests0go0flink0batch0combine0100617065333-root-0617084317-3cedf65d_76bdf5c9-ecb8-4fa8-bcb8-3364587034f1
2022/06/17 08:43:18 Job state: STOPPED
2022/06/17 08:43:18 Job state: STARTING
2022/06/17 08:43:18 Job state: RUNNING
2022/06/17 08:43:36 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e64603eeedf558e498230864e3936064)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/17 08:43:36 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/17 08:43:36 Job state: FAILED
2022/06/17 08:43:36 Failed to execute job: job load0tests0go0flink0batch0combine0100617065333-root-0617084317-3cedf65d_76bdf5c9-ecb8-4fa8-bcb8-3364587034f1 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100617065333-root-0617084317-3cedf65d_76bdf5c9-ecb8-4fa8-bcb8-3364587034f1 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c288, 0xc00012e000}, {0x147edbc?, 0x1f95248?}, {0xc000669e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 42s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/mmre3pgab7f44
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #560
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/560/display/redirect?page=changes>
Changes:
[naireenhussain] convert windmill min timestamp to beam min timestamp
[dannymccormick] Mark issues as triaged when they are assigned
[nielm] Add Spanner Integration tests to verify exception handling
[bulat.safiullin] [BEAM-13229] side nav bug fixed
[bulat.safiullin] fix links for pipelines
[noreply] [BEAM-14524] Returning NamedTuple from RunInference transform (#17773)
[noreply] Unit tests for RunInference keyed/unkeyed Modelhandler and examples
[noreply] Remove kwargs and add explicit runinference_args (#21806)
[noreply] Modify README for 3 pytorch examples (#21871)
[noreply] Sickbay Pytorch example IT test (#21857)
[noreply] Add required=True to Pytorch image classification example (#21883)
[noreply] Switch go todos from issue # syntax to links (#21890)
[Valentyn Tymofieiev] Rollback dill.
[noreply] Add Pytorch image segmentation example (#21766)
[noreply] Add README documentation for scikit-learn MNIST example (#21887)
[noreply] Decompose labels for new issues (#21888)
[noreply] Use Go 1.18 for go-licenses (#21896)
[noreply] [BEAM-12903] Cron job to cleanup Dataproc leaked resources (#21779)
[noreply] [BEAM-7209][BEAM-9351][BEAM-9428] Upgrade Hive to version 3.1.3 (#17749)
[noreply] Sharding IO tests (Kafka, Debezium, JDBC, Kinesis, Neo4j) from the
[noreply] Merge pull request #17604 from [BEAM-14315] Match updated files
[noreply] Merge pull request #21781 from Sklearn Mnist example and IT test
[noreply] Get the latest version of go-licenses (#21901)
[noreply] Hide internal helpers added to DoFn for batched DoFns (#21860)
[noreply] Updated documentation for ml.inference docs. (#21868)
[Pablo Estrada] Moving to 2.41.0-SNAPSHOT on master branch.
[noreply] Add a type hint to nexmark query 3 joinFn (#21873)
------------------------------------------
[...truncated 34.60 KB...]
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/16 08:43:45 Using specified worker binary: 'linux_amd64/combine'
2022/06/16 08:43:46 Prepared job with id: load-tests-go-flink-batch-combine-1-0616065402_c15cbdea-c4e6-4e87-b4d1-51ceaf2a3a45 and staging token: load-tests-go-flink-batch-combine-1-0616065402_c15cbdea-c4e6-4e87-b4d1-51ceaf2a3a45
2022/06/16 08:43:50 Staged binary artifact with token:
2022/06/16 08:43:51 Submitted job: load0tests0go0flink0batch0combine0100616065402-root-0616084350-e2776261_b06a6cbe-123f-49cc-bbff-33297548fdd4
2022/06/16 08:43:51 Job state: STOPPED
2022/06/16 08:43:51 Job state: STARTING
2022/06/16 08:43:51 Job state: RUNNING
2022/06/16 08:44:09 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: d8dd8b1d72d2ad6211037f6add369b47)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2119)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1657)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/16 08:44:09 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/16 08:44:09 Job state: FAILED
2022/06/16 08:44:09 Failed to execute job: job load0tests0go0flink0batch0combine0100616065402-root-0616084350-e2776261_b06a6cbe-123f-49cc-bbff-33297548fdd4 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100616065402-root-0616084350-e2776261_b06a6cbe-123f-49cc-bbff-33297548fdd4 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c288, 0xc0001a6000}, {0x147edbc?, 0x1f95248?}, {0xc000395e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 54s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/cdtmjw4fv2vok
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #559
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/559/display/redirect?page=changes>
Changes:
[nielm] Add transform names to help debug flaky test
[chamikaramj] Automatically enable Runner v2 for pipelines that use cross-language
[Jan Lukavský] [BEAM-14265] Add watermark hold for all timers
[noreply] Bump Python beam-master container (#21820)
[noreply] Split PytorchModelHandler into PytorchModelHandlerTensor and
[noreply] Fix Hadoop Downloader Range not correct (#21778)
[noreply] [BEAM-14036] Read Configuration for Pub/Sub SchemaTransform (#17730)
[noreply] [Go SDK] Add more info to Worker Status API (#21776)
[noreply] Make PeriodicImpulse generates unbounded PCollection (#21815)
[noreply] [BEAM-14267] Update watchForNewFiles to allow watching updated files
[noreply] fix timestamp conversion in Google Cloud Datastore Connector (#17789)
[noreply] Update references to Jira to GH for the Go label (#21830)
[noreply] [#21853] Adjust Go cross-compile to target entire package (#21854)
[Kenneth Knowles] Adjust Jenkins configuration to allow more memory per JVM
[noreply] [BEAM-14553] Add destination coder to FileResultCoder components
[noreply] copyedited README for RunInference examples (#21855)
[noreply] Document and test overriding batch type inference (#21844)
[noreply] Update references to Jira to GH for the Python SDK (#21831)
[noreply] add highlights to changes (#21865)
[noreply] Merge pull request #21793: [21794 ] Fix output timestamp in Dataflow.
[noreply] Adding more info to the sdk_worker_parallelism description (#21839)
[noreply] Add Bert Language Modeling example (#21818)
------------------------------------------
[...truncated 34.46 KB...]
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/15 08:43:28 Using specified **** binary: 'linux_amd64/combine'
2022/06/15 08:43:28 Prepared job with id: load-tests-go-flink-batch-combine-1-0615065325_4810b3f7-956b-47d3-9604-22b1a8015b94 and staging token: load-tests-go-flink-batch-combine-1-0615065325_4810b3f7-956b-47d3-9604-22b1a8015b94
2022/06/15 08:43:33 Staged binary artifact with token:
2022/06/15 08:43:33 Submitted job: load0tests0go0flink0batch0combine0100615065325-root-0615084333-bde1368_30f7c0b4-e63e-47c9-8071-0690c3814c2e
2022/06/15 08:43:33 Job state: STOPPED
2022/06/15 08:43:33 Job state: STARTING
2022/06/15 08:43:33 Job state: RUNNING
2022/06/15 08:43:54 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 543cbe899a9f08aa59a89d2f74157d2b)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/15 08:43:54 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/06/15 08:43:54 Job state: FAILED
2022/06/15 08:43:54 Failed to execute job: job load0tests0go0flink0batch0combine0100615065325-root-0615084333-bde1368_30f7c0b4-e63e-47c9-8071-0690c3814c2e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100615065325-root-0615084333-bde1368_30f7c0b4-e63e-47c9-8071-0690c3814c2e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x160c268, 0xc00012e000}, {0x147edbc?, 0x1f95248?}, {0xc0000f3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 58s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/evedvnropytbc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #558
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/558/display/redirect?page=changes>
Changes:
[noreply] Bump cloud.google.com/go/pubsub from 1.21.1 to 1.22.2 in /sdks
[dannymccormick] Stop collecting jira metrics
[dannymccormick] Move to contains notation
[dannymccormick] fix query to get all updated issues
[noreply] Refactor code according to keyedModelHandler changes (#21819)
[noreply] Add RunInference API to CHANGES.md (#21754)
[Kenneth Knowles] Do not allow postcommit jobs phrase triggering
[noreply] Refactor API code to base.py in RunInference (#21801)
[noreply] Provide a diagnostic error message when a filesystem scheme is not
[Kiley Sok] Disable more triggers
[noreply] [BEAM-14532] Add integration testing to fhirio Read transform (#17803)
[noreply] Merge pull request #17794 from [#21252] Enforce pubsub message
[noreply] Separated pandas and numpy implementations of sklearn. (#21803)
[noreply] Composite triggers and unit tests for Go SDK (#21756)
[Kiley Sok] Enable phrase trigger for a few post commits
[Kiley Sok] spotless
[noreply] [BEAM-14557] Read and Seek Runner Capabilities in Go SDK (#17821)
[noreply] [BEAM-13806] Add x-lang BigQuery IO integration test to Go SDK. (#16818)
------------------------------------------
[...truncated 34.46 KB...]
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/14 08:43:27 Using specified **** binary: 'linux_amd64/combine'
2022/06/14 08:43:27 Prepared job with id: load-tests-go-flink-batch-combine-1-0614065318_4483c4e8-ec5d-4a92-b4e7-b50135fddf4b and staging token: load-tests-go-flink-batch-combine-1-0614065318_4483c4e8-ec5d-4a92-b4e7-b50135fddf4b
2022/06/14 08:43:31 Staged binary artifact with token:
2022/06/14 08:43:32 Submitted job: load0tests0go0flink0batch0combine0100614065318-root-0614084331-aa9ba5e8_78d4d0cd-7fad-4452-b1f4-a6775f712a2d
2022/06/14 08:43:32 Job state: STOPPED
2022/06/14 08:43:32 Job state: STARTING
2022/06/14 08:43:32 Job state: RUNNING
2022/06/14 08:43:49 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 1ee781e87087cdf0ed507161a8bc1548)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2119)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1657)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/14 08:43:49 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/14 08:43:49 Job state: FAILED
2022/06/14 08:43:49 Failed to execute job: job load0tests0go0flink0batch0combine0100614065318-root-0614084331-aa9ba5e8_78d4d0cd-7fad-4452-b1f4-a6775f712a2d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100614065318-root-0614084331-aa9ba5e8_78d4d0cd-7fad-4452-b1f4-a6775f712a2d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x16006c8, 0xc00012e000}, {0x1474794?, 0x1f84068?}, {0xc0002a1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 53s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/5gksx5pcjvkgi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #557
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/557/display/redirect>
Changes:
------------------------------------------
[...truncated 34.33 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/13 08:43:11 Using specified worker binary: 'linux_amd64/combine'
2022/06/13 08:43:12 Prepared job with id: load-tests-go-flink-batch-combine-1-0610150324_682f54f9-929b-41b8-a076-1e44e4809af1 and staging token: load-tests-go-flink-batch-combine-1-0610150324_682f54f9-929b-41b8-a076-1e44e4809af1
2022/06/13 08:43:18 Staged binary artifact with token:
2022/06/13 08:43:19 Submitted job: load0tests0go0flink0batch0combine0100610150324-root-0613084318-34521d48_f70dc3ba-9994-47d4-87f9-3a70192bb787
2022/06/13 08:43:19 Job state: STOPPED
2022/06/13 08:43:19 Job state: STARTING
2022/06/13 08:43:19 Job state: RUNNING
2022/06/13 08:43:37 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: bf1f035b05cbe1e41f787477cce05654)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/13 08:43:37 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/13 08:43:37 Job state: FAILED
2022/06/13 08:43:37 Failed to execute job: job load0tests0go0flink0batch0combine0100610150324-root-0613084318-34521d48_f70dc3ba-9994-47d4-87f9-3a70192bb787 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100610150324-root-0613084318-34521d48_f70dc3ba-9994-47d4-87f9-3a70192bb787 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffd88, 0xc0000480c0}, {0x1474339?, 0x1f83068?}, {0xc0005d1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/deoomzjdkeunw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #556
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/556/display/redirect>
Changes:
------------------------------------------
[...truncated 34.40 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/12 08:43:16 Using specified worker binary: 'linux_amd64/combine'
2022/06/12 08:43:16 Prepared job with id: load-tests-go-flink-batch-combine-1-0610150324_646d8522-7c8c-4015-b9ff-277ace3c73c9 and staging token: load-tests-go-flink-batch-combine-1-0610150324_646d8522-7c8c-4015-b9ff-277ace3c73c9
2022/06/12 08:43:20 Staged binary artifact with token:
2022/06/12 08:43:21 Submitted job: load0tests0go0flink0batch0combine0100610150324-root-0612084320-8d88b862_6ca15f1a-491e-46cf-a245-47025e39b327
2022/06/12 08:43:21 Job state: STOPPED
2022/06/12 08:43:21 Job state: STARTING
2022/06/12 08:43:21 Job state: RUNNING
2022/06/12 08:43:41 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 0508676fbdceca34af11f1af8593af3a)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/12 08:43:41 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/12 08:43:41 Job state: FAILED
2022/06/12 08:43:41 Failed to execute job: job load0tests0go0flink0batch0combine0100610150324-root-0612084320-8d88b862_6ca15f1a-491e-46cf-a245-47025e39b327 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100610150324-root-0612084320-8d88b862_6ca15f1a-491e-46cf-a245-47025e39b327 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffd88, 0xc0000480c0}, {0x1474339?, 0x1f83068?}, {0xc000149e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 42s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/dv7kz4djvq5xo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #555
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/555/display/redirect?page=changes>
Changes:
[noreply] [BEAM-13769]Add no_xdist marker for cloudpickle test (#17538)
[noreply] [BEAM-14533] Bump cloudpickle to 2.1.0 (#17780)
[noreply] Add basic byte size estimation for batches (#17771)
[noreply] Add @yields_batches and @yields_elements (#19268)
[noreply] [BEAM-14535] Added support for pandas in sklearn inference runner
[noreply] Merge ModelLoader and InferenceRunner into same class. (#21795)
[noreply] Merge pull request #17589 from [BEAM-14422] Exception testing for
[noreply] Add README for image classification example (#21758)
[anandinguva98] fixup: bug
[noreply] Fix every PR linking to PR 123 (#21802)
[noreply] Add native PubSub IO prototype to Go (#17955)
[noreply] Allow creation of dynamically defined transforms in the Python expansion
[noreply] Make keying of examples explicit. (#21777)
------------------------------------------
[...truncated 34.56 KB...]
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/11 08:43:51 Using specified **** binary: 'linux_amd64/combine'
2022/06/11 08:43:51 Prepared job with id: load-tests-go-flink-batch-combine-1-0610150324_8ec22515-9408-4b26-a246-4d2be6f7bd05 and staging token: load-tests-go-flink-batch-combine-1-0610150324_8ec22515-9408-4b26-a246-4d2be6f7bd05
2022/06/11 08:43:55 Staged binary artifact with token:
2022/06/11 08:43:56 Submitted job: load0tests0go0flink0batch0combine0100610150324-root-0611084355-85bdc559_0799216f-8d52-4ff8-be79-ae8dda6347d2
2022/06/11 08:43:56 Job state: STOPPED
2022/06/11 08:43:56 Job state: STARTING
2022/06/11 08:43:56 Job state: RUNNING
2022/06/11 08:44:14 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: db147de443e57df006b5f60f4a76f9fd)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/11 08:44:14 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
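The "Provider ... not a subtype" ServiceConfigurationError above is the classic symptom of the same class being visible through two different classloaders (for example, Jackson shipped both in the Flink distribution and in the submitted job artifacts): the provider implements the interface as loaded by one loader, while ServiceLoader checks against the interface as loaded by another, and two Class objects with the same name but different loaders are unrelated types. The sketch below reproduces just that mechanism with illustrative names (`NotASubtypeDemo`, `Module`); it is not Beam or Flink code.

```java
import javax.tools.ToolProvider;

import java.io.File;
import java.io.FileWriter;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;

public class NotASubtypeDemo {

    // Compiles a trivial interface into a temp directory, then loads it
    // through two isolated (null-parent) classloaders, mimicking a class
    // that is present on both the runner classpath and the job jar.
    public static boolean[] demo() throws Exception {
        File dir = Files.createTempDirectory("svcdemo").toFile();
        File src = new File(dir, "Module.java");
        try (FileWriter w = new FileWriter(src)) {
            w.write("public interface Module {}");
        }
        if (ToolProvider.getSystemJavaCompiler()
                .run(null, null, null, src.getPath()) != 0) {
            throw new IllegalStateException("compilation failed");
        }

        URL[] cp = {dir.toURI().toURL()};
        ClassLoader a = new URLClassLoader(cp, null);
        ClassLoader b = new URLClassLoader(cp, null);
        Class<?> fromA = Class.forName("Module", true, a);
        Class<?> fromB = Class.forName("Module", true, b);

        // Same fully-qualified name, but distinct Class objects: any
        // isAssignableFrom / instanceof check across the two fails, which
        // ServiceLoader reports as "Provider ... not a subtype".
        return new boolean[] {
            fromA.getName().equals(fromB.getName()),
            fromA.isAssignableFrom(fromB)
        };
    }

    public static void main(String[] args) throws Exception {
        boolean[] r = demo();
        System.out.println("same name: " + r[0] + ", assignable: " + r[1]);
    }
}
```

In a Flink deployment the usual remedies are aligning the dependency versions or adjusting which classloader (parent-first vs. child-first) resolves the shared library, so only one copy of the interface is ever visible.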
2022/06/11 08:44:14 Job state: FAILED
2022/06/11 08:44:14 Failed to execute job: job load0tests0go0flink0batch0combine0100610150324-root-0611084355-85bdc559_0799216f-8d52-4ff8-be79-ae8dda6347d2 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100610150324-root-0611084355-85bdc559_0799216f-8d52-4ff8-be79-ae8dda6347d2 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffd88, 0xc00012e000}, {0x1474339?, 0x1f83068?}, {0xc0000ede70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 49s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vtekjfppi5tfi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #554
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/554/display/redirect?page=changes>
Changes:
[nishantjain] [BEAM-14000] Elastic search IO does not work when both username/password
[nishantjain] Fixes issue with httpclientbuilder - Use the existing builder instead of
[nishantjain] moves sslcontext towards starting of function
[nishantjain] adds unit test
[nishantjain] changes unit test to directly built restclient
[nishantjain] changes name of unit test
[nishantjain] adds test to all elasticsearch folder
[nishantjain] updates changes.md
[nishantjain] spotless fix
[dannymccormick] Update dashboards to use gh data instead of jira data
[noreply] Merge pull request #21746: Exclude GCP Java packages from Dependabot
[noreply] Update .test-infra/metrics/grafana/dashboards/source_data_freshness.json
[noreply] Better cross language support for dataframe reads. (#21762)
[noreply] Add template_location flag to Go Dataflow runner (#21774)
[noreply] [BEAM-14406] Drain test for SDF truncation in Go SDK (#17814)
[noreply] More Jira -> Issues doc updates (#21770)
[noreply] [BEAM-11104] Add code snippet for Go SDK Self-Checkpointing (#17956)
------------------------------------------
[...truncated 34.35 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/10 08:43:16 Using specified **** binary: 'linux_amd64/combine'
2022/06/10 08:43:16 Prepared job with id: load-tests-go-flink-batch-combine-1-0610065328_90469fea-134a-4971-b3a7-fb3eaf1b0bc8 and staging token: load-tests-go-flink-batch-combine-1-0610065328_90469fea-134a-4971-b3a7-fb3eaf1b0bc8
2022/06/10 08:43:20 Staged binary artifact with token:
2022/06/10 08:43:21 Submitted job: load0tests0go0flink0batch0combine0100610065328-root-0610084320-b594f23d_8c953972-5303-474c-b6e5-c79a2d2c919c
2022/06/10 08:43:21 Job state: STOPPED
2022/06/10 08:43:21 Job state: STARTING
2022/06/10 08:43:21 Job state: RUNNING
2022/06/10 08:43:43 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 1f8efa8e936bf976033a53d2e61500e8)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/10 08:43:43 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
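The "Could not initialize class ... SerializablePipelineOptions" NoClassDefFoundError above is a follow-on failure, not an independent one: the stack trace from the sibling build shows the Jackson ServiceLoader error firing inside that class's static initializer (`<clinit>`). Once a static initializer has thrown, the JVM marks the class erroneous, and every later use in the same JVM throws NoClassDefFoundError with exactly this message. A minimal sketch of that JVM behavior, with illustrative names (`ClinitFailureDemo`, `Fragile`), not Beam's:

```java
public class ClinitFailureDemo {

    // A class whose static initializer always fails, standing in for a
    // <clinit> that trips over a ServiceConfigurationError.
    static class Fragile {
        static {
            if (true) { // javac requires a <clinit> that can complete normally
                throw new RuntimeException("boom during <clinit>");
            }
        }
        static int value = 42;
    }

    // Touches the class and reports which error the JVM surfaces.
    static String errorOnUse() {
        try {
            return "ok: " + Fragile.value;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // First use: the initializer itself blows up.
        System.out.println(errorOnUse()); // ExceptionInInitializerError
        // Later uses in the same JVM: the class is marked erroneous.
        System.out.println(errorOnUse()); // NoClassDefFoundError
    }
}
```

So the fix for these failures lives with the original classloading conflict; the NoClassDefFoundError in later task attempts is merely the long-lived TaskManager JVM remembering the first failed initialization.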
2022/06/10 08:43:43 Job state: FAILED
2022/06/10 08:43:43 Failed to execute job: job load0tests0go0flink0batch0combine0100610065328-root-0610084320-b594f23d_8c953972-5303-474c-b6e5-c79a2d2c919c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100610065328-root-0610084320-b594f23d_8c953972-5303-474c-b6e5-c79a2d2c919c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffd88, 0xc000136000}, {0x1474339?, 0x1f83068?}, {0xc000333e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 49s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ngsqyv65bnuda
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #553
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/553/display/redirect?page=changes>
Changes:
[dannymccormick] Gather metrics on GH Issues
[dannymccormick] Fixes
[dannymccormick] Fixes
[dannymccormick] Comment + naming fix
[dannymccormick] Conflicts fix
[dannymccormick] Ordering
[dannymccormick] Different fallback for prs/issues
[noreply] Add ability to self-assign issues for non-committers (#21719)
[dannymccormick] Fix sync time
[noreply] Dont try to generate jiras as part of dependency report (#21753)
[noreply] Allow users to comment `.take-issue` without taking (#21755)
[noreply] Merge pull request: [Beam-14528]: Add ISO time format support for
[noreply] Update all links to in progress jiras to issues (#21749)
------------------------------------------
[...truncated 34.36 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/09 08:43:29 Using specified worker binary: 'linux_amd64/combine'
2022/06/09 08:43:29 Prepared job with id: load-tests-go-flink-batch-combine-1-0609065322_385a84e0-c8f0-4386-9ba5-444bc636e66f and staging token: load-tests-go-flink-batch-combine-1-0609065322_385a84e0-c8f0-4386-9ba5-444bc636e66f
2022/06/09 08:43:33 Staged binary artifact with token:
2022/06/09 08:43:34 Submitted job: load0tests0go0flink0batch0combine0100609065322-root-0609084333-b98dfe9d_6993addf-14a1-4be6-a716-fbbe9367b3b9
2022/06/09 08:43:34 Job state: STOPPED
2022/06/09 08:43:34 Job state: STARTING
2022/06/09 08:43:34 Job state: RUNNING
2022/06/09 08:43:54 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: bd14c713bc9e6765a0ec8687d35dc319)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/09 08:43:54 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/09 08:43:54 Job state: FAILED
2022/06/09 08:43:54 Failed to execute job: job load0tests0go0flink0batch0combine0100609065322-root-0609084333-b98dfe9d_6993addf-14a1-4be6-a716-fbbe9367b3b9 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100609065322-root-0609084333-b98dfe9d_6993addf-14a1-4be6-a716-fbbe9367b3b9 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffc68, 0xc00012e000}, {0x14742d3?, 0x1f83048?}, {0xc0005b9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/j3rec6n7hsfdi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
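
Editor's note: the root cause in the trace above ("Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions") means the class was located but its static initialization failed on the Flink task manager, which in practice often traces back to a missing or version-mismatched Beam jar on the cluster classpath. As a purely illustrative diagnostic (not taken from this build; the jar path in the usage note is a placeholder), one can check whether a given jar actually bundles a class:

```python
# Illustrative diagnostic: check whether a jar on the Flink classpath
# bundles a given fully-qualified class. Any jar path passed in is a
# placeholder assumption, not taken from this build's environment.
import zipfile


def jar_contains_class(jar_path: str, class_name: str) -> bool:
    """Return True if jar_path contains the .class entry for class_name."""
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()
```

For example, `jar_contains_class('beam-runners-flink-job-server.jar', 'org.apache.beam.runners.core.construction.SerializablePipelineOptions')` returning False for the jar actually deployed to the cluster would point at a packaging problem consistent with the error above.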
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #552
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/552/display/redirect?page=changes>
Changes:
[nielm] Fix SpannerIO service call metrics and improve tests.
[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Dataflow job
[andyye333] Add Pytorch support for batched keyed examples
[andyye333] Add general support for non-batchable kwargs params; Add
[noreply] [BEAM-12554] Create new instances of FileSink in sink_fn (#17708)
[noreply] DataflowRunner: Experiment added to disable unbounded PCcollection
[vachan] Fix for increased FAILED_PRECONDITION errors in BQ Read API.
[noreply] More flexible Python Callable type. (#17767)
[noreply] Fix typos in README (#17675)
[vachan] Adding comments.
[noreply] Bump google.golang.org/api from 0.81.0 to 0.83.0 in /sdks (#21743)
------------------------------------------
[...truncated 34.50 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/08 08:43:50 Using specified worker binary: 'linux_amd64/combine'
2022/06/08 08:43:51 Prepared job with id: load-tests-go-flink-batch-combine-1-0608065320_d9b84640-4ab6-4f1f-a01c-dfbfc48b89b9 and staging token: load-tests-go-flink-batch-combine-1-0608065320_d9b84640-4ab6-4f1f-a01c-dfbfc48b89b9
2022/06/08 08:43:54 Staged binary artifact with token:
2022/06/08 08:43:55 Submitted job: load0tests0go0flink0batch0combine0100608065320-root-0608084355-52cb2078_e1b0df62-fbad-4e12-9026-85e2358d4a26
2022/06/08 08:43:55 Job state: STOPPED
2022/06/08 08:43:55 Job state: STARTING
2022/06/08 08:43:55 Job state: RUNNING
2022/06/08 08:44:16 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: d17a91824068b85327ce36c2f0301e51)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/08 08:44:16 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/08 08:44:16 Job state: FAILED
2022/06/08 08:44:16 Failed to execute job: job load0tests0go0flink0batch0combine0100608065320-root-0608084355-52cb2078_e1b0df62-fbad-4e12-9026-85e2358d4a26 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100608065320-root-0608084355-52cb2078_e1b0df62-fbad-4e12-9026-85e2358d4a26 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffc68, 0xc00012e000}, {0x14742d3?, 0x1f83048?}, {0xc000319e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 57s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/u736n4fywzxl6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #551
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/551/display/redirect?page=changes>
Changes:
[yathu] [BEAM-14471] Fix PytestUnknownMarkingWarning
[Robert Bradshaw] Populate missing display data for remotely expanded transforms.
[Robert Bradshaw] Add an option to run Python operations in-line when invoked as a remote
[Robert Bradshaw] Pass options underlying runner in remote job service.
[noreply] Update Jira -> Issues in the Readme
[noreply] [Fixes #18679] Ensure that usage of metrics on a template job reports an
[noreply] Clean up uses of == instead of === in ts sdk (#17732)
[Robert Bradshaw] Comment, lint fixes.
[noreply] Mount GCP credentials in local docker environments. (#19265)
[noreply] [BEAM-14068]Add Pytorch inference IT test and example (#17462)
[noreply] [Playground] [Hotfix] Remove autoscrolling from embedded editor (#21717)
------------------------------------------
[...truncated 34.47 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/07 08:43:23 Using specified worker binary: 'linux_amd64/combine'
2022/06/07 08:43:24 Prepared job with id: load-tests-go-flink-batch-combine-1-0607065343_1209c8ab-f2a7-41df-bc6b-fb2227eaae15 and staging token: load-tests-go-flink-batch-combine-1-0607065343_1209c8ab-f2a7-41df-bc6b-fb2227eaae15
2022/06/07 08:43:28 Staged binary artifact with token:
2022/06/07 08:43:29 Submitted job: load0tests0go0flink0batch0combine0100607065343-root-0607084328-230ec7ad_46f34d0c-14ff-428e-b35c-e0993cf33f0d
2022/06/07 08:43:29 Job state: STOPPED
2022/06/07 08:43:29 Job state: STARTING
2022/06/07 08:43:29 Job state: RUNNING
2022/06/07 08:43:51 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c178d73ed1c6bbd92afe6b4145b68e3d)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/07 08:43:51 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/07 08:43:51 Job state: FAILED
2022/06/07 08:43:51 Failed to execute job: job load0tests0go0flink0batch0combine0100607065343-root-0607084328-230ec7ad_46f34d0c-14ff-428e-b35c-e0993cf33f0d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100607065343-root-0607084328-230ec7ad_46f34d0c-14ff-428e-b35c-e0993cf33f0d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffc68, 0xc00012e000}, {0x14742d3?, 0x1f83048?}, {0xc000693e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/yll7ngarwgyau
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #550
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/550/display/redirect>
Changes:
------------------------------------------
[...truncated 34.40 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/06 08:43:16 Using specified worker binary: 'linux_amd64/combine'
2022/06/06 08:43:16 Prepared job with id: load-tests-go-flink-batch-combine-1-0606065318_1b6893e6-9b1c-4db2-b719-facb7eb1443b and staging token: load-tests-go-flink-batch-combine-1-0606065318_1b6893e6-9b1c-4db2-b719-facb7eb1443b
2022/06/06 08:43:20 Staged binary artifact with token:
2022/06/06 08:43:21 Submitted job: load0tests0go0flink0batch0combine0100606065318-root-0606084320-f85e1d64_3ccc60ea-d87d-4cb5-a34d-8f002678198e
2022/06/06 08:43:21 Job state: STOPPED
2022/06/06 08:43:21 Job state: STARTING
2022/06/06 08:43:21 Job state: RUNNING
2022/06/06 08:43:39 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: cb9ebdaec6c1fdea71e122b0e97ce455)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor30.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/06 08:43:39 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/06 08:43:39 Job state: FAILED
2022/06/06 08:43:39 Failed to execute job: job load0tests0go0flink0batch0combine0100606065318-root-0606084320-f85e1d64_3ccc60ea-d87d-4cb5-a34d-8f002678198e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100606065318-root-0606084320-f85e1d64_3ccc60ea-d87d-4cb5-a34d-8f002678198e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffc68, 0xc0000480c0}, {0x14742d3?, 0x1f83048?}, {0xc00042de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 39s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ed75zzpmchtcw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #549
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/549/display/redirect>
Changes:
------------------------------------------
[...truncated 34.52 KB...]
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/05 08:43:18 Using specified worker binary: 'linux_amd64/combine'
2022/06/05 08:43:18 Prepared job with id: load-tests-go-flink-batch-combine-1-0605065315_fc881ade-29c0-4883-bfc6-4fbcd5a794a2 and staging token: load-tests-go-flink-batch-combine-1-0605065315_fc881ade-29c0-4883-bfc6-4fbcd5a794a2
2022/06/05 08:43:22 Staged binary artifact with token:
2022/06/05 08:43:23 Submitted job: load0tests0go0flink0batch0combine0100605065315-root-0605084322-d60c2640_8981d1c0-4c4e-4e80-a014-1534c10edeb0
2022/06/05 08:43:23 Job state: STOPPED
2022/06/05 08:43:23 Job state: STARTING
2022/06/05 08:43:23 Job state: RUNNING
2022/06/05 08:43:43 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 3a9d7d2894acf6d2bb297fb630fd15db)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/05 08:43:43 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/06/05 08:43:43 Job state: FAILED
2022/06/05 08:43:43 Failed to execute job: job load0tests0go0flink0batch0combine0100605065315-root-0605084322-d60c2640_8981d1c0-4c4e-4e80-a014-1534c10edeb0 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100605065315-root-0605084322-d60c2640_8981d1c0-4c4e-4e80-a014-1534c10edeb0 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffc68, 0xc0000480c0}, {0x14742d3?, 0x1f83048?}, {0xc00016be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 41s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ys4youy5tqwik
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
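For context on the failure above: the root error is java.util.ServiceConfigurationError "Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype", thrown from ObjectMapper.findModules inside the static initializer of SerializablePipelineOptions. A "not a subtype" error from ServiceLoader typically means the provider class named in META-INF/services was linked against a different copy of the service interface than the one the loader resolved, which is a classic symptom of two jackson-databind copies (e.g. shaded vs. unshaded) visible on the Flink task manager classpath. The sketch below is not Beam or Jackson code; it just reproduces the subtype check ServiceLoader performs before instantiating a provider, with CharSequence standing in for com.fasterxml.jackson.databind.Module:

```java
import java.util.ServiceConfigurationError;

// Sketch: the type check ServiceLoader applies to each provider it finds.
// "Not a subtype" fires when the provider class does not implement the
// service interface *as loaded by the requesting classloader*.
public class NotASubtypeDemo {
    static void checkProvider(Class<?> service, Class<?> provider) {
        if (!service.isAssignableFrom(provider)) {
            throw new ServiceConfigurationError(
                service.getName() + ": Provider " + provider.getName() + " not a subtype");
        }
    }

    public static void main(String[] args) {
        // A valid provider: String implements CharSequence.
        checkProvider(CharSequence.class, String.class);
        System.out.println("String accepted");
        try {
            // Stand-in for the real failure: a provider class that is not a
            // subtype of the service interface the loader resolved.
            checkProvider(CharSequence.class, Integer.class);
        } catch (ServiceConfigurationError e) {
            System.out.println(e.getMessage());
        }
    }
}
```

In the real failure the provider class name matches, but it implements a Module interface from a second jackson-databind copy, so the same isAssignableFrom check fails.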
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #548
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/548/display/redirect?page=changes>
Changes:
[noreply] [BEAM-11167] Updates dill package to version 0.3.5.1 (#17669)
[noreply] [BEAM-6258] Use gRPC 1.33.1 as min version to ensure that we pickup
[noreply] [BEAM-14441] Enable GitHub issues (#17812)
[Pablo Estrada] Revert "Merge pull request #17492 from [BEAM-13945] (FIX) Update Java BQ
[noreply] Alias worker_harness_container_image to sdk_container_image (#17817)
[noreply] [BEAM-14546] Fix errant pass for empty collections in Count (#17813)
[noreply] Merge pull request #17741 from [BEAM-14504] Add support for including
[noreply] Merge pull request #18374 from [BEAM-13945] Roll forward JSON support
[noreply] Merge pull request #17792 from [BEAM-13756] [Playground] Merge Log and
[noreply] Merge pull request #17779: [BEAM-14529] Add integer to float64
[noreply] [BEAM-14556] Honor the formatter installed on the root handler. (#17820)
------------------------------------------
[...truncated 34.48 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/04 08:43:41 Using specified worker binary: 'linux_amd64/combine'
2022/06/04 08:43:41 Prepared job with id: load-tests-go-flink-batch-combine-1-0604065318_6a77077e-2cf4-4951-8e16-738a70458185 and staging token: load-tests-go-flink-batch-combine-1-0604065318_6a77077e-2cf4-4951-8e16-738a70458185
2022/06/04 08:43:45 Staged binary artifact with token:
2022/06/04 08:43:46 Submitted job: load0tests0go0flink0batch0combine0100604065318-root-0604084346-28962f6c_af7f3da1-d623-472a-9b75-709f611d1c68
2022/06/04 08:43:47 Job state: STOPPED
2022/06/04 08:43:47 Job state: STARTING
2022/06/04 08:43:47 Job state: RUNNING
2022/06/04 08:44:05 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 1a685122ef16b4d53cdfbc9422ae0097)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/04 08:44:05 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/04 08:44:05 Job state: FAILED
2022/06/04 08:44:05 Failed to execute job: job load0tests0go0flink0batch0combine0100604065318-root-0604084346-28962f6c_af7f3da1-d623-472a-9b75-709f611d1c68 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100604065318-root-0604084346-28962f6c_af7f3da1-d623-472a-9b75-709f611d1c68 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffc68, 0xc00012e000}, {0x14742d3?, 0x1f83048?}, {0xc0000ffe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 47s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/cmxh2spsy7iny
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
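Note that this build reports java.lang.NoClassDefFoundError "Could not initialize class ... SerializablePipelineOptions" rather than the ServiceConfigurationError itself. These are the same root cause: once a class's static initializer throws, the JVM marks the class erroneous, and every later attempt to use it raises NoClassDefFoundError instead of the original error. A minimal sketch of that JVM behavior (class names here are hypothetical, not Beam code):

```java
// A class whose static initializer always throws, mirroring
// SerializablePipelineOptions.<clinit> failing on the ServiceConfigurationError.
class FragileInit {
    static {
        // "if (true)" keeps javac happy: a static initializer must be able
        // to complete normally as far as the compiler can tell.
        if (true) throw new RuntimeException("simulated ServiceConfigurationError");
    }
}

public class InitFailureDemo {
    public static void main(String[] args) {
        try {
            new FragileInit(); // first use: initializer runs and throws
        } catch (Throwable t) {
            System.out.println(t.getClass().getSimpleName());
        }
        try {
            new FragileInit(); // class is now erroneous; original cause is gone
        } catch (Throwable t) {
            System.out.println(t.getClass().getSimpleName());
        }
    }
}
```

This is why the underlying Jackson conflict only appears in some of these logs: whichever task first touches the class sees the real error, and everything after it sees NoClassDefFoundError.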
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #547
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/547/display/redirect?page=changes>
Changes:
[dannymccormick] [BEAM-14446] Update some docs to point to GitHub issues
[dannymccormick] More doc updates
[dannymccormick] Update issueManagement fields
[dannymccormick] Fix website build
[dannymccormick] Remove extraneous comment line
[noreply] Commit message guidance
[noreply] [BEAM-14539] Ensure that the print stream can handle larger byte arrays
[noreply] [BEAM-10976] Fix bug with bundle finalization on SDFs (and a small doc
[noreply] Bump google.golang.org/grpc from 1.46.2 to 1.47.0 in /sdks (#17806)
[noreply] Rename pytorch files (#17798)
[noreply] Merge pull request #17492 from [BEAM-13945] (FIX) Update Java BQ
[noreply] [BEAM-11105] Add more watermark estimation docs for go (#17785)
[noreply] [BEAM-11106] documentation for SDF truncation in Go (#17781)
------------------------------------------
[...truncated 34.50 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/03 08:43:59 Using specified worker binary: 'linux_amd64/combine'
2022/06/03 08:43:59 Prepared job with id: load-tests-go-flink-batch-combine-1-0603065401_11d43307-4333-4136-98e0-9759fe19ec4a and staging token: load-tests-go-flink-batch-combine-1-0603065401_11d43307-4333-4136-98e0-9759fe19ec4a
2022/06/03 08:44:03 Staged binary artifact with token:
2022/06/03 08:44:04 Submitted job: load0tests0go0flink0batch0combine0100603065401-root-0603084404-2f8b7351_18195830-c065-4ff9-91a2-bc310220753e
2022/06/03 08:44:04 Job state: STOPPED
2022/06/03 08:44:04 Job state: STARTING
2022/06/03 08:44:04 Job state: RUNNING
2022/06/03 08:44:25 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 5e4dbb3359945d1e092a01ae62a19c6c)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/03 08:44:25 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/03 08:44:25 Job state: FAILED
2022/06/03 08:44:25 Failed to execute job: job load0tests0go0flink0batch0combine0100603065401-root-0603084404-2f8b7351_18195830-c065-4ff9-91a2-bc310220753e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100603065401-root-0603084404-2f8b7351_18195830-c065-4ff9-91a2-bc310220753e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ffb48, 0xc0000480c0}, {0x14742c0?, 0x1f83048?}, {0xc0003cbe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 57s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/x4qvrxmmzof5s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
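The builds above all die with java.lang.NoClassDefFoundError: "Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions". That wording matters: the JVM only says "Could not initialize class" when the class's static initializer already failed once earlier, so the logged error hides the original cause. A minimal, Beam-independent sketch of that JVM behavior (class and names are hypothetical, not Beam code):

```java
// Hypothetical demo class, not Beam code: shows why the JVM reports
// "Could not initialize class X" with no cause attached.
public class InitFailureDemo {

    static class Flaky {
        static int VALUE;
        static {
            // Simulate a static initializer that blows up on first use.
            if (true) throw new RuntimeException("static init failed");
        }
    }

    // Touch the class and report which error surfaces.
    static String touch() {
        try {
            return "ok:" + Flaky.VALUE;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // First use: the real cause is still visible.
        System.out.println(touch()); // ExceptionInInitializerError
        // Any later use: the class is marked erroneous, and only
        // NoClassDefFoundError ("Could not initialize class ...") remains.
        System.out.println(touch()); // NoClassDefFoundError
    }
}
```

In these logs the first, informative ExceptionInInitializerError presumably occurred in an earlier task attempt on the same TaskManager JVM; locating it in the Flink taskmanager logs is what would reveal the underlying classpath or version problem.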
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #546
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/546/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Fix parsing of -PenableCheckerFramework in build
[Kenneth Knowles] Fix additional nullness errors in BigQueryIO
[yathu] [BEAM-13984] followup Fix precommit
[noreply] [BEAM-14513] Add read transform and initial healthcare client (#17748)
[noreply] [BEAM-14536] Handle 0.0 splits in offsetrange restriction (#17782)
[noreply] [BEAM-14470] Use lifecycle method names directly. (#17790)
[noreply] [BEAM-14297] add nullable annotations and an integration test (#17742)
[noreply] Only generate Javadocs for latest Spark runner version (Spark 3) to fix
[noreply] Fail Javadoc aggregateJavadoc task if there's an error (#17801)
[noreply] Merge pull request #17753 from [BEAM-14510] adding exception tests to
[noreply] feat: allow for unknown values in change streams (#17655)
[noreply] Support JdbcIO autosharding in Python (#16921)
[noreply] [BEAM-14511] Growable Tracker for Go SDK (#17754)
------------------------------------------
[...truncated 34.45 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
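The beam:go:coder:custom:v1 payloads in the pipeline proto above are base64-encoded custom-coder messages. Even without the proto schema, decoding one shows the coder name and the fully qualified Go encoder/decoder symbols it references. A sketch using the c3 ("json") payload copied verbatim from the log (only the embedded strings are inspected; the surrounding proto field layout is not decoded here):

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class CoderPayloadPeek {
    // The c3 custom coder payload, copied verbatim from the proto dump above.
    static final String C3 =
        "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5j"
      + "EhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5q"
      + "c29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB";

    static String decode(String b64) {
        byte[] raw = Base64.getDecoder().decode(b64);
        // The payload is a serialized proto; its string fields (the coder
        // name and the Go function symbols) are readable as plain bytes.
        return new String(raw, StandardCharsets.ISO_8859_1);
    }

    public static void main(String[] args) {
        String s = decode(C3);
        System.out.println(s.contains("json"));
        System.out.println(s.contains("github.com/apache/beam/sdks/v2/go/pkg/beam.jsonEnc"));
        System.out.println(s.contains("beam.jsonDec"));
    }
}
```

This is handy when triaging such dumps: it confirms which registered Go functions a custom coder points at without rebuilding the pipeline.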
2022/06/02 08:43:36 Using specified worker binary: 'linux_amd64/combine'
2022/06/02 08:43:36 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_bed7c0cc-e15e-4a1a-be7c-4cebc6bc3ce9 and staging token: load-tests-go-flink-batch-combine-1-0518185346_bed7c0cc-e15e-4a1a-be7c-4cebc6bc3ce9
2022/06/02 08:43:40 Staged binary artifact with token:
2022/06/02 08:43:41 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0602084341-c66930bb_b779b8e5-4268-4526-a008-9a8704b521e9
2022/06/02 08:43:41 Job state: STOPPED
2022/06/02 08:43:41 Job state: STARTING
2022/06/02 08:43:41 Job state: RUNNING
2022/06/02 08:44:01 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 66b6c091c2679b8a4f154faa7ae62ba5)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/02 08:44:01 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/02 08:44:01 Job state: FAILED
2022/06/02 08:44:01 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0602084341-c66930bb_b779b8e5-4268-4526-a008-9a8704b521e9 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0602084341-c66930bb_b779b8e5-4268-4526-a008-9a8704b521e9 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fe7c8, 0xc0000480d0}, {0x1473100?, 0x1f82048?}, {0xc0005f1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 56s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/se5sg77xcwkxc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #545
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/545/display/redirect?page=changes>
Changes:
[chamikaramj] Adds Java cross-language transforms for invoking Python Map and FlatMap
[noreply] [BEAM-14255] Drop clock abstraction (#17671)
[noreply] Adds __repr__ to NullableCoder (#17757)
[noreply] Merge pull request #17683 from [BEAM-14475] add test cases to GcsUtil
[noreply] [BEAM-14410] Add test to demonstrate BEAM-14410 issue in non-cython
[noreply] [BEAM-14449] Support cluster provisioning when using Flink on Dataproc
[noreply] [BEAM-14527] Implement "Beam Summit 2022" banner (#17776)
[noreply] Merge pull request #17222 from [BEAM-12164] Feat: Add new restriction
[noreply] Merge pull request #17598 from [BEAM-14451] Support export to BigQuery
[noreply] Add typing information to RunInferrence. (#17762)
------------------------------------------
[...truncated 34.41 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/06/01 08:43:22 Using specified worker binary: 'linux_amd64/combine'
2022/06/01 08:43:23 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_eee7620e-f55e-41b7-b8e1-a886efc8cded and staging token: load-tests-go-flink-batch-combine-1-0518185346_eee7620e-f55e-41b7-b8e1-a886efc8cded
2022/06/01 08:43:27 Staged binary artifact with token:
2022/06/01 08:43:28 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0601084327-2c0df3d2_59e1b8bc-ea9c-4fcf-ab9b-053337dbcfa3
2022/06/01 08:43:28 Job state: STOPPED
2022/06/01 08:43:28 Job state: STARTING
2022/06/01 08:43:28 Job state: RUNNING
2022/06/01 08:43:48 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: d66fa4fdb7abc540300efc1e6b16279c)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/06/01 08:43:48 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/06/01 08:43:48 Job state: FAILED
2022/06/01 08:43:48 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0601084327-2c0df3d2_59e1b8bc-ea9c-4fcf-ab9b-053337dbcfa3 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0601084327-2c0df3d2_59e1b8bc-ea9c-4fcf-ab9b-053337dbcfa3 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fe3e8, 0xc0000480d0}, {0x1472e2c?, 0x1f81048?}, {0xc000673e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/vwkswkvgpvbig
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #544
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/544/display/redirect>
Changes:
------------------------------------------
[...truncated 34.50 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/31 08:43:39 Using specified worker binary: 'linux_amd64/combine'
2022/05/31 08:43:39 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_67687b9c-60a7-4146-a271-95c5838ac762 and staging token: load-tests-go-flink-batch-combine-1-0518185346_67687b9c-60a7-4146-a271-95c5838ac762
2022/05/31 08:43:43 Staged binary artifact with token:
2022/05/31 08:43:44 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0531084343-294e4cb7_0e3aeeea-b734-4e32-8705-72feafb9bb6c
2022/05/31 08:43:44 Job state: STOPPED
2022/05/31 08:43:44 Job state: STARTING
2022/05/31 08:43:44 Job state: RUNNING
2022/05/31 08:44:06 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: de54be4d55d56a95576c3e490313ea21)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/31 08:44:06 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/31 08:44:06 Job state: FAILED
2022/05/31 08:44:06 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0531084343-294e4cb7_0e3aeeea-b734-4e32-8705-72feafb9bb6c failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0531084343-294e4cb7_0e3aeeea-b734-4e32-8705-72feafb9bb6c failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fe3e8, 0xc0000480d0}, {0x1472e2c?, 0x1f81048?}, {0xc0002a5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 49s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ayaf2ofnlw4he
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #543
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/543/display/redirect?page=changes>
Changes:
[noreply] [BEAM-14170] - Create a test that runs sickbayed tests (#17471)
------------------------------------------
[...truncated 34.44 KB...]
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/30 08:43:14 Using specified worker binary: 'linux_amd64/combine'
2022/05/30 08:43:15 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_a2011b05-6d73-43d9-9922-38f75a8926bf and staging token: load-tests-go-flink-batch-combine-1-0518185346_a2011b05-6d73-43d9-9922-38f75a8926bf
2022/05/30 08:43:18 Staged binary artifact with token:
2022/05/30 08:43:20 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0530084319-3e1212e1_967470f4-9d9f-4def-a9a6-2cabaf569000
2022/05/30 08:43:20 Job state: STOPPED
2022/05/30 08:43:20 Job state: STARTING
2022/05/30 08:43:20 Job state: RUNNING
2022/05/30 08:43:41 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 13988d02715daffcfddc44fb72589733)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
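[Editor's note] The trace above shows the actual root cause that the other builds hide: build #543 reports the ServiceConfigurationError thrown while the static initializer of SerializablePipelineOptions ran, whereas builds #517 and #542 report only "NoClassDefFoundError: Could not initialize class ...". This is standard JVM behavior: a class whose static initializer throws is reported once via ExceptionInInitializerError (with the real cause attached), and every later use of that class fails with a bare NoClassDefFoundError. A minimal self-contained sketch of that mechanism (the Flaky class and its message are hypothetical, not Beam code):

```java
// Demonstrates JVM class-initialization failure semantics: the first use of
// a class whose <clinit> throws carries the real cause; later uses only say
// "Could not initialize class ...". Flaky is an illustrative stand-in for
// org.apache.beam.runners.core.construction.SerializablePipelineOptions.
public class InitFailureDemo {
    static class Flaky {
        static {
            // Simulates the static initializer failing, e.g. with a
            // ServiceConfigurationError from ObjectMapper.findModules().
            if (true) throw new RuntimeException("Provider ... not a subtype");
        }
        static int use() { return 0; }
    }

    public static void main(String[] args) {
        try {
            Flaky.use();
        } catch (ExceptionInInitializerError e) {
            // First attempt: the original exception is still attached.
            System.out.println("first: " + e.getCause().getMessage());
        }
        try {
            Flaky.use();
        } catch (NoClassDefFoundError e) {
            // Later attempts: the class is marked erroneous; the cause is gone.
            System.out.println("second: " + e.getMessage());
        }
    }
}
```

Under this reading, the NoClassDefFoundError in #517/#542 is most likely the same Jackson classpath conflict seen here, just observed on a worker where the initializer had already failed once.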
2022/05/30 08:43:41 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/05/30 08:43:41 Job state: FAILED
2022/05/30 08:43:41 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0530084319-3e1212e1_967470f4-9d9f-4def-a9a6-2cabaf569000 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0530084319-3e1212e1_967470f4-9d9f-4def-a9a6-2cabaf569000 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fe3e8, 0xc0000480d0}, {0x1472e2c?, 0x1f81048?}, {0xc0005c1e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 47s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/4mjd2ok5ijtki
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #542
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/542/display/redirect>
Changes:
------------------------------------------
[...truncated 34.51 KB...]
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:worker_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/29 08:43:28 Using specified worker binary: 'linux_amd64/combine'
2022/05/29 08:43:28 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_ab8721b9-126f-4476-a758-6152cc4a8431 and staging token: load-tests-go-flink-batch-combine-1-0518185346_ab8721b9-126f-4476-a758-6152cc4a8431
2022/05/29 08:43:32 Staged binary artifact with token:
2022/05/29 08:43:33 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0529084332-30349353_ed5a3b37-ef25-4010-9c6f-44ca7f352d08
2022/05/29 08:43:33 Job state: STOPPED
2022/05/29 08:43:33 Job state: STARTING
2022/05/29 08:43:33 Job state: RUNNING
2022/05/29 08:43:52 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 73a0907d3620ad514981dc16ed4bfefb)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2119)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1657)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/29 08:43:52 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/29 08:43:52 Job state: FAILED
2022/05/29 08:43:52 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0529084332-30349353_ed5a3b37-ef25-4010-9c6f-44ca7f352d08 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0529084332-30349353_ed5a3b37-ef25-4010-9c6f-44ca7f352d08 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fe3e8, 0xc0000480d0}, {0x1472e2c?, 0x1f81048?}, {0xc00064fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uaem57bud7uao
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
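Editor's note: the root cause in every trace above is the "Could not initialize class" variant of NoClassDefFoundError. That message means the class's static initializer already failed on an earlier load attempt (here, most likely a classpath problem on the Flink task manager); the first touch raises ExceptionInInitializerError, and every later reference — including during ObjectInputStream deserialization, as in these traces — raises NoClassDefFoundError. A minimal sketch of that JVM behavior (all class names below are illustrative, not from the Beam or Flink codebases):

```java
public class InitFailureDemo {
    static class Broken {
        static {
            // Simulates a static initializer that fails, e.g. because a
            // transitive dependency is missing from the runtime classpath.
            if (true) throw new RuntimeException("static init failed");
        }
    }

    /** Touches Broken twice and returns the two distinct errors raised. */
    public static Throwable[] provoke() {
        Throwable[] result = new Throwable[2];
        for (int i = 0; i < 2; i++) {
            try {
                new Broken();
            } catch (Throwable t) {
                result[i] = t;
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Throwable[] r = provoke();
        // First attempt: java.lang.ExceptionInInitializerError
        System.out.println(r[0]);
        // Second attempt: java.lang.NoClassDefFoundError: Could not initialize class ...
        System.out.println(r[1]);
    }
}
```

The second error carries no stack frame from the original failure, which is why the traces above end at the deserialization site rather than at whatever first broke SerializablePipelineOptions' initialization.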
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #541
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/541/display/redirect?page=changes>
Changes:
[ilion.beyst] minor: don't capture stderr in kata tests
[Kiley Sok] Update beam-master version for legacy
[Heejong Lee] Fix NonType error when importing google.api_core fails
[noreply] [BEAM-13972] Update documentation for run inference (#17508)
[noreply] [BEAM-14502] Fix: Splitting scans into smaller chunks to buffer reads
[noreply] [BEAM-14218] Add resource location hints to base inference runner.
[noreply] [BEAM-14442] Ask for repro steps/redirect to user list in bug template
[noreply] [BEAM-14166] Push logic in RowWithGetters down into getters and use
[noreply] cleaned up TypeScript in coders.ts (#17689)
------------------------------------------
[...truncated 34.47 KB...]
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/28 08:43:34 Using specified worker binary: 'linux_amd64/combine'
2022/05/28 08:43:35 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_d6875c24-bb1a-43b4-be12-3363caf5f75d and staging token: load-tests-go-flink-batch-combine-1-0518185346_d6875c24-bb1a-43b4-be12-3363caf5f75d
2022/05/28 08:43:38 Staged binary artifact with token:
2022/05/28 08:43:39 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0528084339-501b427_610ada7f-e447-438a-b7ef-3e306b2abafd
2022/05/28 08:43:39 Job state: STOPPED
2022/05/28 08:43:39 Job state: STARTING
2022/05/28 08:43:39 Job state: RUNNING
2022/05/28 08:44:02 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 79e3d74c36fd93f6fe19e768c27488b7)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/28 08:44:02 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/28 08:44:02 Job state: FAILED
2022/05/28 08:44:02 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0528084339-501b427_610ada7f-e447-438a-b7ef-3e306b2abafd failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0528084339-501b427_610ada7f-e447-438a-b7ef-3e306b2abafd failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fe3e8, 0xc00012e000}, {0x1472e2c?, 0x1f81048?}, {0xc00023be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/lv2j5nu5esooi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #540
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/540/display/redirect?page=changes>
Changes:
[thiagotnunes] BEAM-14419: Remove invalid mod type
[ihr] [BEAM-14006] Update Python katas to 2.38 and fix issue with one test
[Heejong Lee] [BEAM-14478] Fix missing 'projectId' attribute error
[relax] DLQ for BQ Storage Api writes
[noreply] Bump google.golang.org/api from 0.76.0 to 0.81.0 in /sdks
[noreply] [BEAM-14336] Re-enable `flight_delays_it_test` with
[noreply] [BEAM-11106] small nits to truncate sdf exec unit (#17755)
[noreply] Added standard logging when exception is thrown (#17717)
[noreply] [BEAM-13829] Enable worker status in Go
[noreply] [BEAM-14519] Add website page for Go dependencies (#17766)
[noreply] [BEAM-11106] Validate that DoFn returns Process continuation when
[noreply] [BEAM-14505] Add Dataflow streaming pipeline update support to the Go
------------------------------------------
[...truncated 34.31 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/27 08:43:21 Using specified worker binary: 'linux_amd64/combine'
2022/05/27 08:43:21 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_e223bace-b598-46a1-9072-0d3979ed04f3 and staging token: load-tests-go-flink-batch-combine-1-0518185346_e223bace-b598-46a1-9072-0d3979ed04f3
2022/05/27 08:43:25 Staged binary artifact with token:
2022/05/27 08:43:26 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0527084326-feb4a5a8_bfc3b04a-9f2f-4628-942f-e6c93c9d7762
2022/05/27 08:43:26 Job state: STOPPED
2022/05/27 08:43:26 Job state: STARTING
2022/05/27 08:43:26 Job state: RUNNING
2022/05/27 08:43:44 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 17b4533fe8b842ae7afe9f1171d0d5ae)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/27 08:43:44 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/27 08:43:44 Job state: FAILED
2022/05/27 08:43:44 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0527084326-feb4a5a8_bfc3b04a-9f2f-4628-942f-e6c93c9d7762 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0527084326-feb4a5a8_bfc3b04a-9f2f-4628-942f-e6c93c9d7762 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fe3e8, 0xc0000480d0}, {0x1472e2c?, 0x1f81048?}, {0xc000571e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 53s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/7ctlt7i6sk7g6
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #539
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/539/display/redirect?page=changes>
Changes:
[Robert Bradshaw] [BEAM-14426] Allow skipping of any output when writing an empty
[Robert Bradshaw] Add skip_if_empty attribute to base class to fix test.
[Jan Lukavský] [BEAM-14492] add flinkConfDir to FlinkPipelineOptions
[noreply] Bump cloud.google.com/go/storage from 1.22.0 to 1.22.1 in /sdks
[noreply] [BEAM-14139] Remove unused Flink 1.11 directory (#17750)
[noreply] [BEAM-14044] Allow ModelLoader to forward BatchElements args (#17527)
[noreply] [BEAM-14481] Remove unnecessary context (#17737)
[noreply] [BEAM-9324] Fix incompatibility of direct runner with cython (#17728)
[noreply] [BEAM-14503] Add support for Flink 1.15 (#17739)
[noreply] Update Beam website to release 2.39.0 (#17690)
[noreply] [BEAM-14509] Add several flags to dataflow runner (#17752)
[Yichi Zhang] Fix 2.38.0 download page.
[noreply] [BEAM-14494] Fix publish_docker_images.sh (#17756)
------------------------------------------
[...truncated 34.56 KB...]
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/26 08:43:49 Using specified **** binary: 'linux_amd64/combine'
2022/05/26 08:43:49 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_9dafc59b-65fb-4e04-a192-ddd0e589db4c and staging token: load-tests-go-flink-batch-combine-1-0518185346_9dafc59b-65fb-4e04-a192-ddd0e589db4c
2022/05/26 08:43:53 Staged binary artifact with token:
2022/05/26 08:43:54 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0526084354-e0eb5a04_85bcbfe0-0b1d-4eb5-bc0d-26c0098a73a9
2022/05/26 08:43:54 Job state: STOPPED
2022/05/26 08:43:54 Job state: STARTING
2022/05/26 08:43:54 Job state: RUNNING
2022/05/26 08:44:15 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 7670e73bdd3c08ce17494e7f5c607537)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/26 08:44:15 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/05/26 08:44:15 Job state: FAILED
2022/05/26 08:44:15 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0526084354-e0eb5a04_85bcbfe0-0b1d-4eb5-bc0d-26c0098a73a9 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0526084354-e0eb5a04_85bcbfe0-0b1d-4eb5-bc0d-26c0098a73a9 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15fce08, 0xc00012e000}, {0x1471bf0?, 0x1f7f048?}, {0xc000671e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 56s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/36nqgwurjxif2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #538
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/538/display/redirect?page=changes>
Changes:
[Heejong Lee] [BEAM-14471] Adding testcases and examples for xlang Python
[Heejong Lee] update
[Heejong Lee] add DataframeTransform wrapper
[noreply] [BEAM-14298] resolve dependency
[noreply] Fix -- linting issue (#17738)
[noreply] Fix 'NoneType' object has no attribute error
[noreply] [BEAM-12308] change expected value in kakfa IT (#17740)
[noreply] [BEAM-14053] [CdapIO] Add wrapper class for CDAP plugin (#17150)
[noreply] [BEAM-14129] Clean up PubsubLiteIO by removing options that no longer
[noreply] [BEAM-14496] Ensure that precombine is inheriting one of the timestamps
------------------------------------------
[...truncated 34.47 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/25 08:43:44 Using specified **** binary: 'linux_amd64/combine'
2022/05/25 08:43:45 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_326492b5-fa7f-4f08-a15f-07114ebb6008 and staging token: load-tests-go-flink-batch-combine-1-0518185346_326492b5-fa7f-4f08-a15f-07114ebb6008
2022/05/25 08:43:49 Staged binary artifact with token:
2022/05/25 08:43:50 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0525084349-22b817e4_1e34b1c8-dcf8-4a87-8a83-6725d8b20c68
2022/05/25 08:43:50 Job state: STOPPED
2022/05/25 08:43:50 Job state: STARTING
2022/05/25 08:43:50 Job state: RUNNING
2022/05/25 08:44:08 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: d036ea4aa8ce6913b675ca52eff561c1)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/25 08:44:08 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/25 08:44:08 Job state: FAILED
2022/05/25 08:44:08 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0525084349-22b817e4_1e34b1c8-dcf8-4a87-8a83-6725d8b20c68 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0525084349-22b817e4_1e34b1c8-dcf8-4a87-8a83-6725d8b20c68 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15b89c8, 0xc000136000}, {0x1433b54?, 0x1f1b300?}, {0xc000359e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 46s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/glngzxcsjre2q
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #537
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/537/display/redirect?page=changes>
Changes:
[noreply] Add clarification on Filter transform's input function to pydoc.
[noreply] [BEAM-14367]Flaky timeout in
[noreply] [BEAM-14494] Tag rc docker container with format ${RELEASE}rc${RC_NUM}
[noreply] [BEAM-11578] Fix TypeError in dataflow_metrics has 0 distribution sum
[noreply] [BEAM-14499] Step global, unbounded side input case back to warning
[noreply] [BEAM-14484] Step back unexpected primary handling to warnings (#17724)
[noreply] [BEAM-14486] Document pubsubio & fix its behavior. (#17709)
[noreply] [BEAM-14489] Remove non-SDF version of TextIO. (#17712)
------------------------------------------
[...truncated 34.42 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/24 08:43:56 Using specified **** binary: 'linux_amd64/combine'
2022/05/24 08:43:56 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_9ff88a48-e78c-47e0-9b39-73d3e72fa173 and staging token: load-tests-go-flink-batch-combine-1-0518185346_9ff88a48-e78c-47e0-9b39-73d3e72fa173
2022/05/24 08:44:00 Staged binary artifact with token:
2022/05/24 08:44:01 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0524084400-9610a03a_52bd2214-0311-4c02-b85d-590c6333616f
2022/05/24 08:44:01 Job state: STOPPED
2022/05/24 08:44:01 Job state: STARTING
2022/05/24 08:44:01 Job state: RUNNING
2022/05/24 08:44:19 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: f27c3c81828f97c679e928a754da91b6)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/24 08:44:19 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/24 08:44:19 Job state: FAILED
2022/05/24 08:44:19 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0524084400-9610a03a_52bd2214-0311-4c02-b85d-590c6333616f failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0524084400-9610a03a_52bd2214-0311-4c02-b85d-590c6333616f failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15b89c8, 0xc00012e000}, {0x1433b54?, 0x1f1b300?}, {0xc000355e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 2s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/qoeioo5vvvhxq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #536
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/536/display/redirect>
Changes:
------------------------------------------
[...truncated 34.45 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/23 08:43:41 Using specified **** binary: 'linux_amd64/combine'
2022/05/23 08:43:42 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_114ec6b5-6106-4b42-b5cb-b78e9015cd85 and staging token: load-tests-go-flink-batch-combine-1-0518185346_114ec6b5-6106-4b42-b5cb-b78e9015cd85
2022/05/23 08:43:46 Staged binary artifact with token:
2022/05/23 08:43:47 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0523084347-73d7f578_a66894fd-0fde-45ac-9a36-62bbb6165210
2022/05/23 08:43:47 Job state: STOPPED
2022/05/23 08:43:47 Job state: STARTING
2022/05/23 08:43:47 Job state: RUNNING
2022/05/23 08:44:09 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ff4ea6a556459fad8f4fc6cf7c383ff1)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/23 08:44:09 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/23 08:44:10 Job state: FAILED
2022/05/23 08:44:10 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0523084347-73d7f578_a66894fd-0fde-45ac-9a36-62bbb6165210 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0523084347-73d7f578_a66894fd-0fde-45ac-9a36-62bbb6165210 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15b9b68, 0xc0000480d0}, {0x1434c74?, 0x1f1d300?}, {0xc000179e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 48s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/54jxysne7umtu
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #535
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/535/display/redirect>
Changes:
------------------------------------------
[...truncated 34.46 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/22 08:43:30 Using specified **** binary: 'linux_amd64/combine'
2022/05/22 08:43:30 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_f3fb28de-a170-408a-b40a-f4bbcdfba933 and staging token: load-tests-go-flink-batch-combine-1-0518185346_f3fb28de-a170-408a-b40a-f4bbcdfba933
2022/05/22 08:43:34 Staged binary artifact with token:
2022/05/22 08:43:35 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0522084334-4f66e870_a1a42398-c9ff-4c70-8ff0-b13ea9b14b19
2022/05/22 08:43:35 Job state: STOPPED
2022/05/22 08:43:35 Job state: STARTING
2022/05/22 08:43:35 Job state: RUNNING
2022/05/22 08:43:55 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 32c71f130c484cef214c87235f32b74e)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/22 08:43:55 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/22 08:43:55 Job state: FAILED
2022/05/22 08:43:55 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0522084334-4f66e870_a1a42398-c9ff-4c70-8ff0-b13ea9b14b19 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0522084334-4f66e870_a1a42398-c9ff-4c70-8ff0-b13ea9b14b19 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15b9b68, 0xc00012e000}, {0x1434c74?, 0x1f1d300?}, {0xc0000f9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 44s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/myb6atq6f6aru
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #534
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/534/display/redirect?page=changes>
Changes:
[yathu] Add labels for typescript PRs
[bulat.safiullin] [BEAM-14418] added arrows to slider
[noreply] Minor: Bump Dataflow container versions (#17684)
[noreply] Bump google.golang.org/grpc from 1.45.0 to 1.46.2 in /sdks (#17677)
[noreply] [BEAM-13015] Only create a TimerBundleTracker if there are timers.
------------------------------------------
[...truncated 34.64 KB...]
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/21 08:44:00 Using specified **** binary: 'linux_amd64/combine'
2022/05/21 08:44:01 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_5f1925a9-0cd2-44c1-830d-2fea40605a97 and staging token: load-tests-go-flink-batch-combine-1-0518185346_5f1925a9-0cd2-44c1-830d-2fea40605a97
2022/05/21 08:44:05 Staged binary artifact with token:
2022/05/21 08:44:06 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0521084405-277a7b04_0e015b83-0feb-4a56-83f5-fc53752587bf
2022/05/21 08:44:06 Job state: STOPPED
2022/05/21 08:44:06 Job state: STARTING
2022/05/21 08:44:06 Job state: RUNNING
2022/05/21 08:44:28 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c433cf31e4202c7e8d4418f5bd10c3c3)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/21 08:44:28 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/05/21 08:44:28 Job state: FAILED
2022/05/21 08:44:28 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0521084405-277a7b04_0e015b83-0feb-4a56-83f5-fc53752587bf failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0521084405-277a7b04_0e015b83-0feb-4a56-83f5-fc53752587bf failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15b9b68, 0xc000136000}, {0x1434c74?, 0x1f1d300?}, {0xc0000d9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/p5q5svebbefjy
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #533
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/533/display/redirect?page=changes>
Changes:
[chamikaramj] Corrects I/O connectors availability status in Beam Website.
[singh.vikash2310] fixed typos in README.md
[noreply] Update the PTransform and associated APIs to be less class-based.
[noreply] Vortex performance improvement: Enable multiple stream clients per
[noreply] [BEAM-14488] Alias async flags. (#17711)
[noreply] [BEAM-14487] Make drain & update terminal states. (#17710)
[noreply] [BEAM-14484] Improve behavior surrounding primary roots in
[noreply] Improve validation error message (#17719)
[noreply] Remove unused validation configurations. (#17705)
------------------------------------------
[...truncated 34.33 KB...]
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/20 08:43:23 Using specified **** binary: 'linux_amd64/combine'
2022/05/20 08:43:23 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_bd941b30-0c51-435e-a676-4f71d9fc8f58 and staging token: load-tests-go-flink-batch-combine-1-0518185346_bd941b30-0c51-435e-a676-4f71d9fc8f58
2022/05/20 08:43:27 Staged binary artifact with token:
2022/05/20 08:43:28 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0520084327-dc5e5f96_a6a3e457-1164-40f8-8e0a-23c387a998ae
2022/05/20 08:43:28 Job state: STOPPED
2022/05/20 08:43:28 Job state: STARTING
2022/05/20 08:43:28 Job state: RUNNING
2022/05/20 08:43:46 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ea4c65b1c50557d8c9d41aac5fae41ab)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
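A note on the trace above: `NoClassDefFoundError: Could not initialize class ...` with no nested cause means the class's static initializer already failed on an *earlier* load attempt in the same JVM; only that first attempt carries the root cause (build #530 later in this digest shows it as a Jackson `ServiceConfigurationError`). A minimal sketch of this JVM behavior, using a hypothetical `Flaky` class:

```java
// Sketch (hypothetical class names): why the JVM reports
// "NoClassDefFoundError: Could not initialize class X" with no cause.
// The real cause is thrown during X's static initializer on the FIRST
// attempt; the class is then marked erroneous, and every later attempt
// gets a bare NoClassDefFoundError.
public class InitFailureDemo {
    static class Flaky {
        static {
            if (true) throw new RuntimeException("root cause, seen only once");
        }
    }

    static String tryInit() {
        try {
            new Flaky();
            return "ok";
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryInit()); // ExceptionInInitializerError (wraps the root cause)
        System.out.println(tryInit()); // NoClassDefFoundError (cause already discarded)
    }
}
```

So when hunting this failure, the first occurrence in the TaskManager log is the one that matters.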
2022/05/20 08:43:46 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/20 08:43:46 Job state: FAILED
2022/05/20 08:43:46 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0520084327-dc5e5f96_a6a3e457-1164-40f8-8e0a-23c387a998ae failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0520084327-dc5e5f96_a6a3e457-1164-40f8-8e0a-23c387a998ae failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ac9c8, 0xc0001a8000}, {0x1428f49?, 0x1f0b200?}, {0xc0002a9e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 54s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/qtkc3jc2grhne
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #532
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/532/display/redirect?page=changes>
Changes:
[bulat.safiullin] [BEAM-14428] change text, change styling of connectors and contribute
[noreply] [BEAM-14474] Suppress 'Mean of empty slice' Runtime Warning in dataframe
[noreply] [BEAM-10529] update KafkaIO Xlang integration test to publish and
[noreply] Fix a few small linting bugs (#17695)
[noreply] Bump github.com/lib/pq from 1.10.5 to 1.10.6 in /sdks (#17691)
[noreply] Update release-guide.md
------------------------------------------
[...truncated 34.32 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/19 08:43:34 Using specified **** binary: 'linux_amd64/combine'
2022/05/19 08:43:34 Prepared job with id: load-tests-go-flink-batch-combine-1-0518185346_e8818338-9e94-42d3-9e99-a344b1c5cb94 and staging token: load-tests-go-flink-batch-combine-1-0518185346_e8818338-9e94-42d3-9e99-a344b1c5cb94
2022/05/19 08:43:38 Staged binary artifact with token:
2022/05/19 08:43:39 Submitted job: load0tests0go0flink0batch0combine0100518185346-root-0519084338-8c086255_d0c410b5-78d1-406b-a65b-4b089fdef93e
2022/05/19 08:43:39 Job state: STOPPED
2022/05/19 08:43:39 Job state: STARTING
2022/05/19 08:43:39 Job state: RUNNING
2022/05/19 08:43:56 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ae606297ab34d0a29c1c57095546189b)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/19 08:43:56 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/19 08:43:56 Job state: FAILED
2022/05/19 08:43:57 Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0519084338-8c086255_d0c410b5-78d1-406b-a65b-4b089fdef93e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518185346-root-0519084338-8c086255_d0c410b5-78d1-406b-a65b-4b089fdef93e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ac608, 0xc000136000}, {0x1428d5f?, 0x1f0a200?}, {0xc00026fe70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 57s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hjg4llizh6uy2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #531
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/531/display/redirect?page=changes>
Changes:
[mmack] [BEAM-14334] Remove remaining forkEvery 1 from all Spark tests and stop
[noreply] Merge pull request #17678 from [BEAM-14460] [Playground] WIP. Fix error
[Alexey Romanenko] [BEAM-14035] Fix checkstyle issue
[noreply] [BEAM-14441] Automatically assign issue labels based on responses to
[noreply] README update for the Docker Error 255 during Website launch on Apple
[noreply] [BEAM-12000] Update programming-guide.md (#17679)
[noreply] [BEAM-14467] Fix bug where run_pytest.sh does not elevate errors raised
------------------------------------------
[...truncated 34.43 KB...]
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/18 08:43:15 Using specified **** binary: 'linux_amd64/combine'
2022/05/18 08:43:15 Prepared job with id: load-tests-go-flink-batch-combine-1-0518065329_834628e3-74ad-4388-aff9-f87e9f98419b and staging token: load-tests-go-flink-batch-combine-1-0518065329_834628e3-74ad-4388-aff9-f87e9f98419b
2022/05/18 08:43:19 Staged binary artifact with token:
2022/05/18 08:43:20 Submitted job: load0tests0go0flink0batch0combine0100518065329-root-0518084319-6aef0689_372eb619-d6e9-4a6e-8e2e-690308474a58
2022/05/18 08:43:20 Job state: STOPPED
2022/05/18 08:43:20 Job state: STARTING
2022/05/18 08:43:20 Job state: RUNNING
2022/05/18 08:43:38 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 3455039a186d6105a51d96a3851e3f52)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
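The `Provider ... not a subtype` error above comes from `java.util.ServiceLoader`: it reads provider class names from `META-INF/services/<interface>` resources and requires each provider to be a subtype of the service interface *as loaded by the same classloader*. If Flink's user-code classloader sees a second copy of `com.fasterxml.jackson.databind.Module`, the name matches but the subtype check fails. A minimal sketch of the mechanism, with a hypothetical `Codec` SPI that has no registered providers:

```java
import java.util.ServiceLoader;

// Sketch of the ServiceLoader SPI mechanism behind the "not a subtype"
// failure. With no META-INF/services/SpiSketch$Codec resource on the
// classpath, iteration simply yields nothing; with a duplicate copy of the
// service interface on two classloaders, iteration instead throws
// java.util.ServiceConfigurationError, as in the trace above.
public class SpiSketch {
    public interface Codec { String name(); }  // hypothetical SPI interface

    static long providerCount() {
        long n = 0;
        for (Codec c : ServiceLoader.load(Codec.class)) n++;
        return n;
    }

    public static void main(String[] args) {
        System.out.println(providerCount()); // no providers registered here
    }
}
```

The usual fix is to ensure exactly one Jackson version reaches the TaskManager classpath (shading, or Flink's classloader configuration), so the interface and its providers resolve through the same loader.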
2022/05/18 08:43:38 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/05/18 08:43:38 Job state: FAILED
2022/05/18 08:43:39 Failed to execute job: job load0tests0go0flink0batch0combine0100518065329-root-0518084319-6aef0689_372eb619-d6e9-4a6e-8e2e-690308474a58 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100518065329-root-0518084319-6aef0689_372eb619-d6e9-4a6e-8e2e-690308474a58 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ac608, 0xc00012e000}, {0x1428d5f?, 0x1f0a200?}, {0xc000203e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 42s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/wojc3lulj4vtg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #530
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/530/display/redirect?page=changes>
Changes:
[noreply] [BEAM-13015] Update the SDK harness grouping table to be memory bounded
[noreply] [BEAM-13982] Added output of logging for python E2E pytests (#17637)
[noreply] [BEAM-14473] Throw error if using globally windowed, unbounded side
[noreply] [BEAM-14440] Add basic fuzz tests to the coders package (#17587)
[noreply] [BEAM-14035 ] Implement BigQuerySchema Read/Write TransformProvider
[noreply] Add Akvelon to case-studies (#17611)
[noreply] Merge pull request #17520 from BEAM-12356 Close DatasetService leaked
[noreply] Adding eslint and lint configuration to TypeScript SDK (#17676)
[noreply] Update release-guide.md
[noreply] Update release-guide.md
[noreply] [BEAM-14411] Re-enable TypecodersTest, fix most issues (#17547)
------------------------------------------
[...truncated 34.27 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/17 08:43:13 Using specified **** binary: 'linux_amd64/combine'
2022/05/17 08:43:13 Prepared job with id: load-tests-go-flink-batch-combine-1-0517065338_225a8043-69e5-4b28-8a11-2855f8da51b2 and staging token: load-tests-go-flink-batch-combine-1-0517065338_225a8043-69e5-4b28-8a11-2855f8da51b2
2022/05/17 08:43:17 Staged binary artifact with token:
2022/05/17 08:43:18 Submitted job: load0tests0go0flink0batch0combine0100517065338-root-0517084317-bf789fb5_eb6d45b5-f136-446d-959d-ed42ef63cba5
2022/05/17 08:43:18 Job state: STOPPED
2022/05/17 08:43:18 Job state: STARTING
2022/05/17 08:43:18 Job state: RUNNING
2022/05/17 08:43:36 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e016cdb4b05d4d42baca6658f5aebd5e)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/17 08:43:36 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/17 08:43:36 Job state: FAILED
2022/05/17 08:43:36 Failed to execute job: job load0tests0go0flink0batch0combine0100517065338-root-0517084317-bf789fb5_eb6d45b5-f136-446d-959d-ed42ef63cba5 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100517065338-root-0517084317-bf789fb5_eb6d45b5-f136-446d-959d-ed42ef63cba5 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ac608, 0xc00012e000}, {0x1428d5f?, 0x1f0a200?}, {0xc000599e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 49s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/23jiyowflv2qi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #529
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/529/display/redirect>
Changes:
------------------------------------------
[...truncated 34.24 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/16 08:43:13 Using specified **** binary: 'linux_amd64/combine'
2022/05/16 08:43:14 Prepared job with id: load-tests-go-flink-batch-combine-1-0516065339_ae352713-1211-407a-a5c9-802ccecd0c14 and staging token: load-tests-go-flink-batch-combine-1-0516065339_ae352713-1211-407a-a5c9-802ccecd0c14
2022/05/16 08:43:17 Staged binary artifact with token:
2022/05/16 08:43:18 Submitted job: load0tests0go0flink0batch0combine0100516065339-root-0516084317-ef9343aa_da02a5a9-f3d6-43e0-9f37-b9ebdc98b397
2022/05/16 08:43:18 Job state: STOPPED
2022/05/16 08:43:18 Job state: STARTING
2022/05/16 08:43:18 Job state: RUNNING
2022/05/16 08:43:36 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c881a7c7f7eddab40202eaa0eaeb101e)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor28.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/16 08:43:36 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/16 08:43:36 Job state: FAILED
2022/05/16 08:43:36 Failed to execute job: job load0tests0go0flink0batch0combine0100516065339-root-0516084317-ef9343aa_da02a5a9-f3d6-43e0-9f37-b9ebdc98b397 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100516065339-root-0516084317-ef9343aa_da02a5a9-f3d6-43e0-9f37-b9ebdc98b397 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ac528, 0xc00012e000}, {0x1428d5f?, 0x1f0a200?}, {0xc0002e3e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 43s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jv6frduorkmpg
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #528
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/528/display/redirect?page=changes>
Changes:
[noreply] [BEAM-14470] Use Generic Registrations in loadtests. (#17673)
------------------------------------------
[...truncated 34.49 KB...]
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/15 08:43:31 Using specified **** binary: 'linux_amd64/combine'
2022/05/15 08:43:32 Prepared job with id: load-tests-go-flink-batch-combine-1-0515065350_a084b106-ae70-4cfe-a7e1-ce6a4843ecf0 and staging token: load-tests-go-flink-batch-combine-1-0515065350_a084b106-ae70-4cfe-a7e1-ce6a4843ecf0
2022/05/15 08:43:35 Staged binary artifact with token:
2022/05/15 08:43:36 Submitted job: load0tests0go0flink0batch0combine0100515065350-root-0515084336-9b6d9dae_997be6b6-1495-485e-87f2-df975d2cf9a8
2022/05/15 08:43:36 Job state: STOPPED
2022/05/15 08:43:36 Job state: STARTING
2022/05/15 08:43:36 Job state: RUNNING
2022/05/15 08:43:59 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c6a08108ea71608a04d863f1cef367c2)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor26.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readArray(ObjectInputStream.java:2119)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1657)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1446)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/15 08:43:59 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/15 08:43:59 Job state: FAILED
2022/05/15 08:43:59 Failed to execute job: job load0tests0go0flink0batch0combine0100515065350-root-0515084336-9b6d9dae_997be6b6-1495-485e-87f2-df975d2cf9a8 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100515065350-root-0515084336-9b6d9dae_997be6b6-1495-485e-87f2-df975d2cf9a8 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x15ac528, 0xc0000480c0}, {0x1428d5f?, 0x1f0a200?}, {0xc000627e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 57s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jvh4vhillfq5a
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
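The root cause in the trace above, `java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions`, is the JVM's signature for a class whose static initializer already failed once: the first initialization attempt throws `ExceptionInInitializerError`, the class is marked erroneous, and every later use (here, Flink's Java deserialization touching the class) gets `NoClassDefFoundError`. A minimal sketch of that JVM behavior, using a hypothetical `Fragile` class (not Beam code), is:

```java
// Demonstrates why a class whose static init failed earlier later surfaces as
// "NoClassDefFoundError: Could not initialize class ..." — the symptom seen in
// the Flink task logs above. Class names here are illustrative only.
public class StaticInitDemo {
    static class Fragile {
        // Static initialization runs on first active use and throws.
        static final long ID = compute();
        static long compute() { throw new RuntimeException("init failed"); }
    }

    // Attempts to use Fragile; returns the error type observed by the caller.
    static String touch() {
        try {
            new Fragile();
            return "ok";
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // First use: the initializer itself fails.
        System.out.println(touch()); // ExceptionInInitializerError
        // Any later use of the now-erroneous class:
        System.out.println(touch()); // NoClassDefFoundError
    }
}
```

In the build above the same pattern means the interesting failure is the *first* initialization of `SerializablePipelineOptions` on the task manager (typically a missing or incompatible dependency on the Flink cluster's classpath), which may be logged earlier or on another worker than the `NoClassDefFoundError` shown here.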
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #527
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/527/display/redirect?page=changes>
Changes:
[Heejong Lee] [BEAM-14455] Add UUID to sub-schemas for PythonExternalTransform
[Heejong Lee] [BEAM-14430] Adding a logical type support for Python callables to Row
[Heejong Lee] add urn, type inference for PythonCallableSource
[Heejong Lee] fix lint errors
[Heejong Lee] move logical types def
[Heejong Lee] add micros_instant urn
[Heejong Lee] put a default type hint for PythonCallableSource
[Heejong Lee] add comment
[noreply] Revert "Better test assertion. (#17551)"
[noreply] Bump github.com/spf13/cobra from 1.3.0 to 1.4.0 in /sdks (#17647)
[noreply] [BEAM-14465] Reduce DefaultS3ClientBuilderFactory logging to debug level
[noreply] Merge pull request #17365 from [BEAM-12482] Update Schema Destination
[noreply] [BEAM-14014] Support impersonation credentials in dataflow runner
[noreply] [BEAM-14469] Allow nil primary returns from TrySplit in a single-window
[noreply] Add some auto-starting runners to the typescript SDK. (#17580)
[noreply] [BEAM-14371] (and BEAM-14372) - enable a couple staticchecks (#17670)
------------------------------------------
[...truncated 34.22 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/14 08:43:02 Using specified **** binary: 'linux_amd64/combine'
2022/05/14 08:43:02 Prepared job with id: load-tests-go-flink-batch-combine-1-0514065328_2ad90f02-c934-4af2-9ad6-8c17fc6bd155 and staging token: load-tests-go-flink-batch-combine-1-0514065328_2ad90f02-c934-4af2-9ad6-8c17fc6bd155
2022/05/14 08:43:05 Staged binary artifact with token:
2022/05/14 08:43:07 Submitted job: load0tests0go0flink0batch0combine0100514065328-root-0514084306-2ac1aee6_ede3eda5-0877-4603-9197-0ca404550a8d
2022/05/14 08:43:07 Job state: STOPPED
2022/05/14 08:43:07 Job state: STARTING
2022/05/14 08:43:07 Job state: RUNNING
2022/05/14 08:43:29 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 76ef9a1cb987d736cb3fd79205bb0de1)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/14 08:43:29 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/14 08:43:29 Job state: FAILED
2022/05/14 08:43:29 Failed to execute job: job load0tests0go0flink0batch0combine0100514065328-root-0514084306-2ac1aee6_ede3eda5-0877-4603-9197-0ca404550a8d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100514065328-root-0514084306-2ac1aee6_ede3eda5-0877-4603-9197-0ca404550a8d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1109b38, 0xc000136000}, {0xfbc2e8?, 0x1829180?}, {0xc0005d7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/7p7pljahzsuei
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #526
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/526/display/redirect?page=changes>
Changes:
[dannymccormick] [BEAM-14441] Add GitHub issue templates
[dannymccormick] Ask for beam version + other dependencies
[dannymccormick] We don't need outage
[dannymccormick] Cut p4
[chamikaramj] Updates CHANGES.md to include some recently discovered known issues
[noreply] [BEAM-14334] Fix leakage of SparkContext in Spark runner tests to remove
[noreply] Typo & link update (#17633)
[dannymccormick] Pare down to fewer templates
[noreply] Trigger go precommits on go mod/sum changes (#17636)
[noreply] Revert "[BEAM-14429] Force java load test on dataflow runner v2
[noreply] [BEAM-14347] Add generic registration feature to CHANGES (#17643)
[noreply] Better test assertion. (#17551)
[noreply] Bump github.com/google/go-cmp from 0.5.7 to 0.5.8 in /sdks (#17628)
[noreply] Bump github.com/testcontainers/testcontainers-go in /sdks (#17627)
[noreply] Bump github.com/lib/pq from 1.10.4 to 1.10.5 in /sdks (#17626)
[noreply] Merge pull request #17584 from [BEAM-14415] Exception handling tests and
[noreply] Bump cloud.google.com/go/pubsub from 1.18.0 to 1.21.1 in /sdks (#17646)
[noreply] Merge pull request #17408 from [BEAM-14312] [Website] change section
[noreply] Bump cloud.google.com/go/bigquery from 1.28.0 to 1.32.0 in /sdks
[noreply] [BEAM-14347] Add function for simple function registration (#17650)
[noreply] Drop dataclasses requirement, we only support python 3.7+ (#17640)
------------------------------------------
[...truncated 34.40 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/13 08:44:28 Using specified **** binary: 'linux_amd64/combine'
2022/05/13 08:44:29 Prepared job with id: load-tests-go-flink-batch-combine-1-0513065350_42e786a3-df2e-478e-810d-9c42a1bf48c5 and staging token: load-tests-go-flink-batch-combine-1-0513065350_42e786a3-df2e-478e-810d-9c42a1bf48c5
2022/05/13 08:44:32 Staged binary artifact with token:
2022/05/13 08:44:33 Submitted job: load0tests0go0flink0batch0combine0100513065350-root-0513084432-badabae2_6b7fa4f3-5a43-4998-97e2-2a0c00faf407
2022/05/13 08:44:33 Job state: STOPPED
2022/05/13 08:44:33 Job state: STARTING
2022/05/13 08:44:33 Job state: RUNNING
2022/05/13 08:44:51 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 09287a35f47837bd047aa0a520d860a5)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor24.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/13 08:44:51 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/13 08:44:51 Job state: FAILED
2022/05/13 08:44:51 Failed to execute job: job load0tests0go0flink0batch0combine0100513065350-root-0513084432-badabae2_6b7fa4f3-5a43-4998-97e2-2a0c00faf407 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100513065350-root-0513084432-badabae2_6b7fa4f3-5a43-4998-97e2-2a0c00faf407 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1109b98, 0xc000134000}, {0xfbc348?, 0x1828180?}, {0xc00065be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 10s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/hms24xxkj4vqa
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
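A note on the root cause above: "Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions" means the class's static initializer already failed once on the taskmanager JVM. One thing worth ruling out on a mixed cluster is a bytecode/runtime mismatch. The sketch below is a hedged diagnostic, not part of the Beam scripts; the jar name and class path in the usage note are assumptions. It reads the class-file major version (bytes 6-7 of the `.class` header; 52 = Java 8, 55 = Java 11) so it can be compared with the cluster's JVM.

```shell
# Hedged diagnostic sketch (assumption: GNU od/awk are available on the node).
read_class_major() {
  # Prints the big-endian u16 major version from a .class byte stream on stdin.
  od -An -j6 -N2 -tu1 | awk '{ print $1 * 256 + $2 }'
}

# Demo against a synthetic class header (0xCAFEBABE, minor 0, major 0x37):
printf '\xca\xfe\xba\xbe\x00\x00\x00\x37' | read_class_major   # prints 55 (Java 11)
```

Against a real artifact this would be run as, e.g., `unzip -p <job-server jar> org/apache/beam/runners/core/construction/SerializablePipelineOptions.class | read_class_major` (the jar name is hypothetical here).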
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #525
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/525/display/redirect?page=changes>
Changes:
[randomstep] [BEAM-14096] bump junit-quickcheck to 1.0
[noreply] [BEAM-5492] Python Dataflow integration tests should export the pipeline
[noreply] [BEAM-14396] Bump httplib2 upper bound. (#17602)
[noreply] [BEAM-11104] Add self-checkpointing to CHANGES.md (#17612)
[noreply] [BEAM-14081] [CdapIO] Add context classes for CDAP plugins (#17104)
[noreply] [BEAM-12526] Add Dependabot (#17563)
[noreply] Remove python 3.6 postcommit from mass_comment.py (#17630)
[noreply] [BEAM-14347] Add some benchmarks for generic registration (#17613)
[noreply] Correctly route go dependency changes to go label (#17632)
[noreply] [BEAM-13695] Add jamm jvm options to Java 11 (#17178)
------------------------------------------
Started by timer
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-2 (beam) in workspace <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/>
The recommended git tool is: NONE
No credentials specified
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
> git init <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
> git --version # timeout=10
> git --version # 'git version 2.25.1'
> git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/apache/beam.git # timeout=10
> git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
> git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
> git fetch --tags --force --progress -- https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/* # timeout=10
> git rev-parse origin/master^{commit} # timeout=10
Checking out Revision a1674241c6895302d6225c92c0a8f40956a24c04 (origin/master)
> git config core.sparsecheckout # timeout=10
> git checkout -f a1674241c6895302d6225c92c0a8f40956a24c04 # timeout=10
Commit message: "[BEAM-13695] Add jamm jvm options to Java 11 (#17178)"
> git rev-list --no-walk df67c817184acafc4adbc56a725eed4985053171 # timeout=10
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content
SPARK_LOCAL_IP=127.0.0.1
SETUPTOOLS_USE_DISTUTILS=stdlib
[EnvInject] - Variables injected successfully.
[EnvInject] - Injecting environment variables from a build step.
[EnvInject] - Injecting as environment variables the properties content
JOB_SERVER_IMAGE=gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest
CLUSTER_NAME=beam-loadtests-go-combine-flink-batch-525
DETACHED_MODE=true
HARNESS_IMAGES_TO_PULL=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
FLINK_NUM_WORKERS=5
FLINK_DOWNLOAD_URL=https://archive.apache.org/dist/flink/flink-1.12.3/flink-1.12.3-bin-scala_2.11.tgz
GCS_BUCKET=gs://beam-flink-cluster
HADOOP_DOWNLOAD_URL=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
FLINK_TASKMANAGER_SLOTS=1
ARTIFACTS_DIR=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-525
GCLOUD_ZONE=us-central1-a
[EnvInject] - Variables injected successfully.
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins1439940870564900522.sh
+ echo Setting up flink cluster
Setting up flink cluster
[beam_LoadTests_Go_Combine_Flink_Batch] $ /bin/bash -xe /tmp/jenkins5004751438133050368.sh
+ cd <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/.test-infra/dataproc>
+ ./flink_cluster.sh create
+ GCLOUD_ZONE=us-central1-a
+ DATAPROC_VERSION=2.0
+ GCLOUD_REGION=us-central1
+ MASTER_NAME=beam-loadtests-go-combine-flink-batch-525-m
+ INIT_ACTIONS_FOLDER_NAME=init-actions
+ FLINK_INIT=gs://beam-flink-cluster/init-actions/flink.sh
+ BEAM_INIT=gs://beam-flink-cluster/init-actions/beam.sh
+ DOCKER_INIT=gs://beam-flink-cluster/init-actions/docker.sh
+ FLINK_LOCAL_PORT=8081
+ FLINK_TASKMANAGER_SLOTS=1
+ YARN_APPLICATION_MASTER=
+ create
+ upload_init_actions
+ echo 'Uploading initialization actions to GCS bucket: gs://beam-flink-cluster'
Uploading initialization actions to GCS bucket: gs://beam-flink-cluster
+ gsutil cp -r init-actions/beam.sh init-actions/docker.sh init-actions/flink.sh gs://beam-flink-cluster/init-actions
Copying file://init-actions/beam.sh [Content-Type=text/x-sh]...
Copying file://init-actions/docker.sh [Content-Type=text/x-sh]...
Copying file://init-actions/flink.sh [Content-Type=text/x-sh]...
Operation completed over 3 objects/13.5 KiB.
+ create_cluster
+ local metadata=flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.12.3/flink-1.12.3-bin-scala_2.11.tgz,
+ metadata+=flink-start-yarn-session=true,
+ metadata+=flink-taskmanager-slots=1,
+ metadata+=hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest ]]
+ metadata+=,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest ]]
+ metadata+=,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest
+ local image_version=2.0
+ echo 'Starting dataproc cluster. Dataproc version: 2.0'
Starting dataproc cluster. Dataproc version: 2.0
+ local num_dataproc_workers=6
+ gcloud dataproc clusters create beam-loadtests-go-combine-flink-batch-525 --region=us-central1 --num-workers=6 --metadata flink-snapshot-url=https://archive.apache.org/dist/flink/flink-1.12.3/flink-1.12.3-bin-scala_2.11.tgz,flink-start-yarn-session=true,flink-taskmanager-slots=1,hadoop-jar-url=https://repo.maven.apache.org/maven2/org/apache/flink/flink-shaded-hadoop-2-uber/2.8.3-10.0/flink-shaded-hadoop-2-uber-2.8.3-10.0.jar,beam-sdk-harness-images-to-pull=gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest,beam-job-server-image=gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest, --image-version=2.0 --zone=us-central1-a --optional-components=FLINK,DOCKER --quiet
Waiting on operation [projects/apache-beam-testing/regions/us-central1/operations/4b74fdc6-ad03-3b24-b96f-b6ed350e7b5a].
Waiting for cluster creation operation...
...done.
Created [https://dataproc.googleapis.com/v1/projects/apache-beam-testing/regions/us-central1/clusters/beam-loadtests-go-combine-flink-batch-525] Cluster placed in zone [us-central1-a].
+ get_leader
+ local i=0
+ local application_ids
+ local application_masters
+ echo 'Yarn Applications'
Yarn Applications
++ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-525-m '--command=yarn application -list'
++ grep beam-loadtests-go-combine-flink-batch-525
Warning: Permanently added 'compute.8366466622123243241' (ECDSA) to the list of known hosts.
2022-05-12 08:42:37,129 INFO client.RMProxy: Connecting to ResourceManager at beam-loadtests-go-combine-flink-batch-525-m/10.128.1.19:8032
2022-05-12 08:42:37,585 INFO client.AHSProxy: Connecting to Application History server at beam-loadtests-go-combine-flink-batch-525-m/10.128.1.19:10200
+ read line
+ echo application_1652344896809_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
application_1652344896809_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
++ echo application_1652344896809_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
++ sed 's/ .*//'
+ application_ids[$i]=application_1652344896809_0002
++ echo application_1652344896809_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
++ sed 's/.*beam-loadtests-go-combine-flink-batch-525/beam-loadtests-go-combine-flink-batch-525/'
++ sed 's/ .*//'
+ application_masters[$i]=beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
+ i=1
+ read line
+ '[' 1 '!=' 1 ']'
+ YARN_APPLICATION_MASTER=beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
+ echo 'Using Yarn Application master: beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015'
Using Yarn Application master: beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
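The `get_leader` trace above can be replayed in isolation. Using the actual application line from this log: the first `sed` cuts at the first space to keep the YARN application id, and the second `sed` pair strips everything up to the cluster-name prefix inside the tracking URL, leaving the application master host:port.

```shell
# Replay of the get_leader extraction above (sample line taken from this log).
line='application_1652344896809_0002 flink-dataproc Apache Flink root default RUNNING UNDEFINED 100% http://beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015'

# Application id: everything before the first space.
app_id=$(echo "$line" | sed 's/ .*//')

# Application master: drop the columns (and "http://") before the cluster-name
# prefix in the tracking URL, then cut at the next space (here: end of line).
master=$(echo "$line" \
  | sed 's/.*beam-loadtests-go-combine-flink-batch-525/beam-loadtests-go-combine-flink-batch-525/' \
  | sed 's/ .*//')

echo "$app_id"   # application_1652344896809_0002
echo "$master"   # beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015
```

This works because the cluster name occurs in the line only inside the tracking URL's hostname, so the prefix substitution lands exactly at the host:port of the application master, matching the value the script logs above.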
+ [[ -n gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest ]]
+ start_job_server
+ gcloud compute ssh --zone=us-central1-a --quiet yarn@beam-loadtests-go-combine-flink-batch-525-m '--command=sudo --user yarn docker run --detach --publish 8099:8099 --publish 8098:8098 --publish 8097:8097 --volume ~/.config/gcloud:/root/.config/gcloud gcr.io/apache-beam-testing/beam_portability/beam_flink1.12_job_server:latest --flink-master=beam-loadtests-go-combine-flink-batch-525-w-4.c.apache-beam-testing.internal:36015 --artifacts-dir=gs://beam-flink-cluster/beam-loadtests-go-combine-flink-batch-525'
docker: Got permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Post http://%2Fvar%2Frun%2Fdocker.sock/v1.40/containers/create: dial unix /var/run/docker.sock: connect: permission denied.
See 'docker run --help'.
Build step 'Execute shell' marked build as failure
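The "permission denied" on /var/run/docker.sock above means the ssh user is not in the "docker" group on the Dataproc master. The snippet below is a hedged sketch, not part of the Beam scripts: a pre-flight check that would surface the misconfiguration before `start_job_server` runs. The user name "yarn" is taken from the gcloud ssh command above; `check_docker_access` is a hypothetical helper.

```shell
# Hypothetical pre-flight check (not from the Beam repo): verify the given user
# is in the "docker" group before trying to launch the job server container.
check_docker_access() {
  user="$1"
  if id -nG "$user" 2>/dev/null | tr ' ' '\n' | grep -qx docker; then
    echo "$user is in the docker group"
  else
    echo "$user lacks docker access; fix with: sudo usermod -aG docker $user"
  fi
}

check_docker_access yarn
```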
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #524
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/524/display/redirect?page=changes>
Changes:
[Alexey Romanenko] [BEAM-12918] Add PostCommit_Java_Tpcds_Spark job
[johnjcasey] [BEAM-14448] add datastore test
[yathu] [BEAM-14423] Add test cases for BigtableIO.BigtableWriterFn fails due to
[Pablo Estrada] Revert "Merge pull request #17517 from [BEAM-14383] Improve "FailedRows"
[noreply] [BEAM-14229] Fix SyntheticUnboundedSource duplication from checkpoint
[noreply] [BEAM-14347] Rename registration package to register (#17603)
[noreply] [BEAM-11104] Add self-checkpointing integration test (#17590)
------------------------------------------
[...truncated 34.43 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
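Aside (a sketch, not part of the job output): the `beam:go:coder:custom:v1` payloads in the dump above are base64-encoded protobuf messages. Decoding just the first eight characters of the c3 payload shows protobuf field 1 (tag 0x0a, length 4) carrying the custom coder's name, "json".

```shell
# Decode the leading bytes of the c3 custom-coder payload from the dump above:
# 0x0a (field 1, length-delimited), length 4, then the coder name "json".
echo 'CgRqc29u' | base64 -d | od -An -c
```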
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_worker_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/11 08:43:21 Using specified worker binary: 'linux_amd64/combine'
2022/05/11 08:43:22 Prepared job with id: load-tests-go-flink-batch-combine-1-0511065326_f3014df8-65a9-49f0-a3b0-57b6fe149edf and staging token: load-tests-go-flink-batch-combine-1-0511065326_f3014df8-65a9-49f0-a3b0-57b6fe149edf
2022/05/11 08:43:25 Staged binary artifact with token:
2022/05/11 08:43:26 Submitted job: load0tests0go0flink0batch0combine0100511065326-root-0511084325-c38fa015_fe9a42e4-b624-4997-bd31-0f391210ec49
2022/05/11 08:43:26 Job state: STOPPED
2022/05/11 08:43:26 Job state: STARTING
2022/05/11 08:43:26 Job state: RUNNING
2022/05/11 08:43:46 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 1fef93005931f3426fe1705010e5fa6a)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/11 08:43:46 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/11 08:43:46 Job state: FAILED
2022/05/11 08:43:46 Failed to execute job: job load0tests0go0flink0batch0combine0100511065326-root-0511084325-c38fa015_fe9a42e4-b624-4997-bd31-0f391210ec49 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100511065326-root-0511084325-c38fa015_fe9a42e4-b624-4997-bd31-0f391210ec49 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1103058, 0xc00012e000}, {0xfb671e?, 0x181d408?}, {0xc0000f5e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 41s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/uds2cplejnhra
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #523
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/523/display/redirect?page=changes>
Changes:
[elias.segundo] Changing elegibility to AllNodeElegibility
[andyye333] Add extra details to PubSub matcher errors
[noreply] Merge pull request #17559 from [BEAM-14423] Add exception injection
[noreply] [BEAM-11104] Allow self-checkpointing SDFs to return without finishing
[noreply] Merge pull request #17544 from [BEAM-14415] Exception handling tests for
[noreply] Merge pull request #17565 from [BEAM-14413] add Kafka exception test
[noreply] Merge pull request #17555 from [BEAM-14417] Adding exception handling
[noreply] [BEAM-14433] Improve Go split error message. (#17575)
[noreply] [BEAM-14429] Force java load test on dataflow runner v2
[noreply] Merge pull request #17577 from [BEAM-14435] Adding exception handling
[noreply] [BEAM-14347] Add generic registration functions for iters and emitters
[noreply] [BEAM-14169] Add Credentials rotation cron job for clusters (#17383)
[noreply] [BEAM-14347] Add generic registration for accumulators (#17579)
------------------------------------------
[...truncated 34.50 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/10 08:43:51 Using specified **** binary: 'linux_amd64/combine'
2022/05/10 08:43:52 Prepared job with id: load-tests-go-flink-batch-combine-1-0510065321_5298bd64-989d-4322-becb-e1247d6e94f2 and staging token: load-tests-go-flink-batch-combine-1-0510065321_5298bd64-989d-4322-becb-e1247d6e94f2
2022/05/10 08:43:55 Staged binary artifact with token:
2022/05/10 08:43:56 Submitted job: load0tests0go0flink0batch0combine0100510065321-root-0510084355-f04cddb3_b0804992-1d67-4dd8-b55c-50892efc6595
2022/05/10 08:43:56 Job state: STOPPED
2022/05/10 08:43:56 Job state: STARTING
2022/05/10 08:43:56 Job state: RUNNING
2022/05/10 08:44:14 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: ab1bfad14c20f0ff7ad3d649e9af855d)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/10 08:44:14 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/10 08:44:14 Job state: FAILED
2022/05/10 08:44:14 Failed to execute job: job load0tests0go0flink0batch0combine0100510065321-root-0510084355-f04cddb3_b0804992-1d67-4dd8-b55c-50892efc6595 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100510065321-root-0510084355-f04cddb3_b0804992-1d67-4dd8-b55c-50892efc6595 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1103058, 0xc00012e000}, {0xfb671e?, 0x181d408?}, {0xc0006d7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 50s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/nbekxj4rlaroo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #522
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/522/display/redirect?page=changes>
Changes:
[chamikaramj] Adds code reviewers for GCP I/O connectors and KafkaIO to Beam OWNERS
------------------------------------------
[...truncated 34.52 KB...]
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/09 08:43:06 Using specified **** binary: 'linux_amd64/combine'
2022/05/09 08:43:07 Prepared job with id: load-tests-go-flink-batch-combine-1-0509065315_92870cc2-650d-4a7f-989d-28484735e629 and staging token: load-tests-go-flink-batch-combine-1-0509065315_92870cc2-650d-4a7f-989d-28484735e629
2022/05/09 08:43:10 Staged binary artifact with token:
2022/05/09 08:43:10 Submitted job: load0tests0go0flink0batch0combine0100509065315-root-0509084310-84731015_aec1407e-8a69-4d3e-80c1-3c1a77807b5b
2022/05/09 08:43:11 Job state: STOPPED
2022/05/09 08:43:11 Job state: STARTING
2022/05/09 08:43:11 Job state: RUNNING
2022/05/09 08:43:30 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: e853f3cd6d31fd4fcf0822c33f5beb37)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/09 08:43:30 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/05/09 08:43:31 Job state: FAILED
2022/05/09 08:43:31 Failed to execute job: job load0tests0go0flink0batch0combine0100509065315-root-0509084310-84731015_aec1407e-8a69-4d3e-80c1-3c1a77807b5b failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100509065315-root-0509084310-84731015_aec1407e-8a69-4d3e-80c1-3c1a77807b5b failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1102ff8, 0xc0000480c0}, {0xfb671e?, 0x181d408?}, {0xc00029be70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 38s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/jbco7nbf7w6dk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #521
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/521/display/redirect>
Changes:
------------------------------------------
[...truncated 34.53 KB...]
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/08 08:43:18 Using specified **** binary: 'linux_amd64/combine'
2022/05/08 08:43:18 Prepared job with id: load-tests-go-flink-batch-combine-1-0508065312_01540d16-bb2b-4392-893c-5e21c12e1dda and staging token: load-tests-go-flink-batch-combine-1-0508065312_01540d16-bb2b-4392-893c-5e21c12e1dda
2022/05/08 08:43:21 Staged binary artifact with token:
2022/05/08 08:43:22 Submitted job: load0tests0go0flink0batch0combine0100508065312-root-0508084321-5da59584_d0d96e80-1847-485e-ab9f-681f5c1ba5a4
2022/05/08 08:43:22 Job state: STOPPED
2022/05/08 08:43:22 Job state: STARTING
2022/05/08 08:43:22 Job state: RUNNING
2022/05/08 08:43:43 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c8dfb498ab627bdcf01a6cdae06e0e5b)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
at java.util.ServiceLoader.fail(ServiceLoader.java:239)
at java.util.ServiceLoader.access$300(ServiceLoader.java:185)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:376)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:1081)
at org.apache.beam.runners.core.construction.SerializablePipelineOptions.<clinit>(SerializablePipelineOptions.java:38)
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/08 08:43:43 (): java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype
2022/05/08 08:43:43 Job state: FAILED
2022/05/08 08:43:43 Failed to execute job: job load0tests0go0flink0batch0combine0100508065312-root-0508084321-5da59584_d0d96e80-1847-485e-ab9f-681f5c1ba5a4 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100508065312-root-0508084321-5da59584_d0d96e80-1847-485e-ab9f-681f5c1ba5a4 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1102ff8, 0xc0000480c0}, {0xfb671e?, 0x181d408?}, {0xc000631e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 40s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/e4pvbpl35nshs
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
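The root cause above ("Provider com.fasterxml.jackson.module.jaxb.JaxbAnnotationModule not a subtype") is a java.util.ServiceLoader symptom: the provider class implements a copy of the com.fasterxml.jackson.databind.Module interface loaded by a different classloader, typically because duplicate Jackson jars are visible to the Flink task manager. A minimal diagnostic sketch (the class name JacksonServiceCheck is hypothetical and not part of Beam or Flink) counts how many service registration files resolve for that interface; more than one origin usually points at the conflicting jars:

```java
import java.io.IOException;
import java.net.URL;
import java.util.Collections;
import java.util.Enumeration;
import java.util.List;

// Hedged diagnostic sketch, not part of the Beam codebase: list every
// jar/classpath entry that registers a com.fasterxml.jackson.databind.Module
// provider via the ServiceLoader mechanism.
public class JacksonServiceCheck {
    public static void main(String[] args) throws IOException {
        String resource =
            "META-INF/services/com.fasterxml.jackson.databind.Module";
        Enumeration<URL> urls =
            JacksonServiceCheck.class.getClassLoader().getResources(resource);
        List<URL> found = Collections.list(urls);
        System.out.println("service registration files: " + found.size());
        for (URL u : found) {
            System.out.println("  " + u); // each entry contributes providers
        }
    }
}
```

Running this on the task manager's effective classpath (e.g. via a small probe jar) would show whether the JAXB annotation module is being registered from more than one jar; on a bare classpath with no Jackson present it simply reports zero files.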
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #520
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/520/display/redirect?page=changes>
Changes:
[kevinsijo] Setting up a basic directory
[kevinsijo] Mirroring Python SDK's directory structure
[kerrydc] Adds initial tests
[kevinsijo] 'runners' is the correct directory name
[Pablo Estrada] sketching the core API for JS SDK
[jonathanlui] add .gitignore for node/ts project
[Robert Bradshaw] Worker directory.
[Robert Bradshaw] Fix compile errors with explicit any for callables.
[Robert Bradshaw] Add worker entry point.
[Robert Bradshaw] Add proto generation code.
[Robert Bradshaw] Add generated proto files.
[Robert Bradshaw] Attempts to get ts protos to compile.
[Robert Bradshaw] Exclude ts protos for now.
[Robert Bradshaw] More changes to get ts protos working.
[Robert Bradshaw] Update scripts and config to get protos compiling.
[Robert Bradshaw] Update generated files.
[jonathanlui] add build and clean script to compile ts
[Robert Bradshaw] Generate server for loopback worker.
[Robert Bradshaw] Generated grpc servers for loopback.
[Robert Bradshaw] Add typescript formatter.
[Robert Bradshaw] Loopback server (that does nothing).
[Robert Bradshaw] Working server.
[Pablo Estrada] Starting expansion of primitive transforms
[Pablo Estrada] Starting to implement and support standard coders
[Robert Bradshaw] Also generate grpc clients.
[Robert Bradshaw] Basic implementation of worker harness.
[Pablo Estrada] fix the build
[Robert Bradshaw] Add some missing files for worker harness.
[Robert Bradshaw] Refactor operators to use registration.
[jonathanlui] enable ts in mocha
[jonathanlui] update readme
[jonathanlui] --save-dev @types/mocha
[jonathanlui] translate core_test.js to typescript
[Robert Bradshaw] Encapsulate worker service in a class.
[Kenneth Knowles] Port standard_coders_test to typescript (superficially)
[Pablo Estrada] Starting the proto translation of Impulse, ParDo, GBK
[Robert Bradshaw] Add some tests for the worker code.
[Robert Bradshaw] Fixing old lock file error.
[Pablo Estrada] Adding transform names and fixing GBK coder issue
[Robert Bradshaw] npx tsfmt -r src/apache_beam/base.ts src/apache_beam/transforms/core.ts
[Kenneth Knowles] switch to import style require() statements
[Kenneth Knowles] Add Coder interface using protobufjs classes
[Kenneth Knowles] BytesCoder with some failures
[noreply] Added GeneralObjectCoder and using it as coder for most transforms (#9)
[Kenneth Knowles] Fix order of arguments to deepEqual
[Kenneth Knowles] Encode expected encoding as binary
[Robert Bradshaw] Refactor API to allow for composites.
[jrmccluskey] Initial setup for automated Java expansion startup
[jrmccluskey] Update exp_service.ts
[Kenneth Knowles] Fix up coder deserialization
[Robert Bradshaw] Simplify GBK coder computation.
[Robert Bradshaw] Remove top-level PValue.
[Pablo Estrada] Make tests green
[Robert Bradshaw] Rename PValueish to PValue.
[jonathanlui] node runner
[jonathanlui] whitespaces
[Robert Bradshaw] Make Runner.run async.
[jonathanlui] bson and fast-deep-equal should not be listed as devdependency
[jrmccluskey] Add basic Dockerfile that starts ExternalWorkerPool
[Robert Bradshaw] Direct runner.
[kevinsijo] Testing expansion service communication
[Robert Bradshaw] Added flatten, assertion checkers.
[Pablo Estrada] progress on basic coders
[Robert Bradshaw] Fixing the build.
[Robert Bradshaw] Cleanup, simplify access.
[Pablo Estrada] Adding limited support for KVCoder and IterableCoder
[Robert Bradshaw] Introduce PipelineContext.
[Robert Bradshaw] Add toProto to all coders.
[Robert Bradshaw] Some work with coders.
[Robert Bradshaw] Remove debug logging.
[Robert Bradshaw] Use coders over data channel.
[Kenneth Knowles] explicitly sequence sub-coder serializations
[Kenneth Knowles] no more need to extend FakeCoder
[Kenneth Knowles] actually advance reader
[Kenneth Knowles] autoformat
[Kenneth Knowles] protobufjs already can write and read signed varints
[Kenneth Knowles] with improved test harness, kv has many more failures
[Kenneth Knowles] read bytescoder from correct position
[Kenneth Knowles] no more fake coders
[Kenneth Knowles] varint examples all work
[Kenneth Knowles] simplify coder value parsing
[Kenneth Knowles] global window coder
[Kenneth Knowles] fix swapEndian32
[Robert Bradshaw] Add P(...) operator.
[kevinsijo] Implementing RowCoder encoding.
[jrmccluskey] remove unused container dir
[kevinsijo] Corrected sorting of encoded positions to reflect an argsort instead.
[Robert Bradshaw] Populate environments.
[kevinsijo] Implementing RowCoder decoding.
[Kenneth Knowles] preliminary unbounded iterable coder
[Kenneth Knowles] friendlier description of standard coder test case
[Kenneth Knowles] fix test harness; iterable works
[jrmccluskey] first pass at boot.go
[jonathanlui] update package-lock.json
[jonathanlui] make NodeRunner a subclass of Runner
[jonathanlui] add waitUntilFinish interface member
[Pablo Estrada] Adding double coder
[Kenneth Knowles] scaffolding for windowed values
[Pablo Estrada] Adding type information to PCollection and PTransform
[jonathanlui] fix direct runner
[Pablo Estrada] Adding typing information for DoFns
[Kenneth Knowles] add interval window
[Robert Bradshaw] Export PValue.
[Robert Bradshaw] Add CombineFn interface.
[Robert Bradshaw] Typed flatten.
[jonathanlui] add runAsync method to base.Runner
[Kenneth Knowles] add Long package
[Pablo Estrada] Adding more types. Making PValue typed
[Kenneth Knowles] instant coder draft
[Robert Bradshaw] Return job state from direct runner.
[Kenneth Knowles] type instant = long
[jonathanlui] implement NodeRunner.runPipeline
[Kenneth Knowles] autoformat
[kevinsijo] Completed implementation of basic row coder
[Kenneth Knowles] Fix IntervalWindowCoder, almost
[Kenneth Knowles] fix interval window coder
[Kenneth Knowles] autoformat
[Robert Bradshaw] loopback runner works
[Kenneth Knowles] move core element types into values.ts
[Kenneth Knowles] just build object directly to be cool
[Robert Bradshaw] GBK working on ULR.
[Robert Bradshaw] Async transforms.
[Robert Bradshaw] External transform graph splicing.
[Kenneth Knowles] progress on windowed value: paneinfo encoding
[Robert Bradshaw] Fix merge.
[Robert Bradshaw] autoformat
[Kenneth Knowles] full windowed value coder
[kerrydc] Updates tests to use correct types, adds generics where needed to DoFns
[Robert Bradshaw] Add serialization libraries.
[Robert Bradshaw] Add Split() PTransform, for producing multiple outputs from a single
[Robert Bradshaw] Schema-encoded external payloads.
[kevinsijo] Adding Schema inference from JSON
[Pablo Estrada] Removing unused directories
[Pablo Estrada] Support for finishBundle and improving typing annotations.
[Pablo Estrada] A base implementation of combiners with GBK/ParDo
[Robert Bradshaw] Fully propagate windowing information in both remote and direct runner.
[Robert Bradshaw] Make args and kwargs optional for python external transform.
[Robert Bradshaw] Infer schema for external transforms.
[Pablo Estrada] Implementing a custom combine fn as an example. Small fixes
[Robert Bradshaw] Fix missing windowing information in combiners.
[Robert Bradshaw] PostShuffle needn't group by key as that's already done.
[Robert Bradshaw] Guard pre-combine for global window only.
[Robert Bradshaw] WindowInto
[Robert Bradshaw] Fix optional kwargs.
[Robert Bradshaw] A couple of tweaks for js + py
[Robert Bradshaw] Add windowing file.
[Robert Bradshaw] CombineBy transform, stand-alone WordCount.
[Robert Bradshaw] cleanup
[Robert Bradshaw] Actually fix optional external kwargs.
[Robert Bradshaw] Demo2, textio read.
[Robert Bradshaw] Add command lines for starting up the servers.
[Robert Bradshaw] Run prettier on the full codebase.
[Robert Bradshaw] Update deps.
[Pablo Estrada] Adding docstrings for core.ts. Prettier dependency
[Pablo Estrada] Documenting coder interfaces
[Pablo Estrada] Added documentation for a few standard coders
[Robert Bradshaw] Unified grouping and combining.
[Robert Bradshaw] Allow PCollection ids to be lazy.
[Robert Bradshaw] Reorganize module structure.
[Robert Bradshaw] A couple more renames.
[Robert Bradshaw] Simplify.
[Robert Bradshaw] Consolidation.
[Robert Bradshaw] Fix build.
[Robert Bradshaw] Add optional context to ParDo.
[Robert Bradshaw] fixup: iterable coder endian sign issue
[Robert Bradshaw] omit context for map(console.log)
[Robert Bradshaw] Fix ReadFromText coders.
[Robert Bradshaw] Flesh out README with overview and current state.
[noreply] Readme typo
[Robert Bradshaw] Two more TODOs.
[noreply] Add a pointer to the example wordcount to the readme.
[Pablo Estrada] Documenting coders and implementing unknown-length method
[Robert Bradshaw] UUID dependency.
[Robert Bradshaw] Artifact handling.
[Robert Bradshaw] Properly wait on data channel for bundle completion.
[Robert Bradshaw] Automatic java expansion service startup.
[Robert Bradshaw] Process promises.
[Robert Bradshaw] Implement side inputs.
[Robert Bradshaw] Cleanup.
[Robert Bradshaw] Put complex context stuff in its own file.
[Robert Bradshaw] Rename BoundedWindow to just Window.
[Robert Bradshaw] Alternative splitter class.
[Pablo Estrada] Documenting internal functions
[Robert Bradshaw] Take a pass clarifying the TODOs.
[Robert Bradshaw] Sql transform wrapper.
[Robert Bradshaw] Incorporate some feedback into the TODOs.
[Robert Bradshaw] More TODOs.
[Robert Bradshaw] Remove app placeholder.
[Robert Bradshaw] Apache license headers.
[Robert Bradshaw] More TODOs
[jankuehle] Suggestions for TypeScript todos
[dannymccormick] Add actions for typescript sdk
[dannymccormick] Fix test command
[noreply] Add missing version
[dannymccormick] Fix codecovTest command
[noreply] Only do prettier check on linux
[noreply] Only get codecov on linux
[Robert Bradshaw] Resolve some comments.
[Robert Bradshaw] Fix compile errors.
[Robert Bradshaw] Prettier.
[Robert Bradshaw] Re-order expandInternal arguments pending unification.
[Robert Bradshaw] More consistent and stricter PTransform naming.
[Robert Bradshaw] Notes on explicit, if less idiomatic, use of classes.
[Robert Bradshaw] Let DoFn be an interface rather than a class.
[Robert Bradshaw] Provide DoFn context to start and finish bundle.
[Robert Bradshaw] Optional promise code simplification.
[Robert Bradshaw] Cleanup todos.
[Robert Bradshaw] Avoid any type where not needed.
[Robert Bradshaw] Apache RAT excludes for typescript.
[Robert Bradshaw] Remove empty READMEs.
[Robert Bradshaw] Add licences statement to readme files.
[Robert Bradshaw] More RAT fixes.
[Robert Bradshaw] Another unsupported coder.
[Robert Bradshaw] Remove debugging code.
[noreply] Fix automatic naming with code coverage.
[Robert Bradshaw] Coders cleanup.
[Robert Bradshaw] Add tests for RowCoder.
[Robert Bradshaw] Normalize capitalization, comments.
[Robert Bradshaw] Install typescript closure packages.
[Robert Bradshaw] npm audit fix
[Robert Bradshaw] Move more imports out of base.
[Robert Bradshaw] Changes needed to compile with ts closure plugin.
[Robert Bradshaw] Use ttsc and ts-closure-transform plugin.
[Robert Bradshaw] Serialization registration to actually get serialization working.
[Robert Bradshaw] Container images working on local runner.
[Robert Bradshaw] Add a portable job server that proxies the Dataflow backend. (#17189)
[Robert Bradshaw] Improvements to dataflow job service for non-Python jobs.
[Robert Bradshaw] Get dataflow working.
[Robert Bradshaw] User friendly pipeline options.
[Robert Bradshaw] Less classes, more functions.
[Robert Bradshaw] Add new nullable standard coder.
[Robert Bradshaw] Make Apache Rat happy.
[Robert Bradshaw] Disable broken codecov.
[Robert Bradshaw] Remove last uses of base.ts.
[Robert Bradshaw] Remove unneeded file.
[Robert Bradshaw] Remove more unneeded/unused files.
[Robert Bradshaw] Cleanup tests.
[Robert Bradshaw] Minor cleanups to coder tests.
[noreply] Quote pip install package name
[noreply] [BEAM-14374] Fix module import error in FullyQualifiedNamedTransform
[Robert Bradshaw] Addressing issues from the review.
[noreply] Apply suggestions from code review.
[Robert Bradshaw] Post-merge fixes.
[dannymccormick] Delete tags.go
[Robert Bradshaw] Update tests to use our actual serialization libraries.
[Robert Bradshaw] Another pass at TODOs, removing finished items.
[Heejong Lee] [BEAM-14146] Python Streaming job failing to drain with BigQueryIO write
[Kenneth Knowles] Add parameter for service account impersonation in GCP credentials
[Heejong Lee] add test
[noreply] Merge pull request #17490 from [BEAM-14370] [Website] Add new page about
[noreply] [BEAM-14332] Refactored cluster management for Flink on Dataproc
[noreply] [BEAM-13988] Update mtime to use time.UnixMilli() calls (#17578)
[noreply] Fixing patching error on missing dependencies (#17564)
[noreply] Merge pull request #17517 from [BEAM-14383] Improve "FailedRows" errors
[Heejong Lee] add test without mock
------------------------------------------
[...truncated 34.33 KB...]
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/07 08:43:26 Using specified **** binary: 'linux_amd64/combine'
2022/05/07 08:43:27 Prepared job with id: load-tests-go-flink-batch-combine-1-0507065313_6b09a5d2-1950-44b8-adb6-ee076ef597d2 and staging token: load-tests-go-flink-batch-combine-1-0507065313_6b09a5d2-1950-44b8-adb6-ee076ef597d2
2022/05/07 08:43:30 Staged binary artifact with token:
2022/05/07 08:43:31 Submitted job: load0tests0go0flink0batch0combine0100507065313-root-0507084330-f9de427d_f5d29da7-69bc-4c21-bd3d-d48dea6b95e8
2022/05/07 08:43:31 Job state: STOPPED
2022/05/07 08:43:31 Job state: STARTING
2022/05/07 08:43:31 Job state: RUNNING
2022/05/07 08:43:53 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 68bdb93a0952f28461c52d09b6843c31)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getInputSerializer(TaskConfig.java:459)
at org.apache.flink.runtime.operators.DataSinkTask.initInputReaders(DataSinkTask.java:413)
at org.apache.flink.runtime.operators.DataSinkTask.invoke(DataSinkTask.java:117)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/07 08:43:53 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/07 08:43:53 Job state: FAILED
2022/05/07 08:43:53 Failed to execute job: job load0tests0go0flink0batch0combine0100507065313-root-0507084330-f9de427d_f5d29da7-69bc-4c21-bd3d-d48dea6b95e8 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100507065313-root-0507084330-f9de427d_f5d29da7-69bc-4c21-bd3d-d48dea6b95e8 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1102ff8, 0xc0000480c0}, {0xfb671e?, 0x181d408?}, {0xc0005c7e70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 52s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/esu5wa4wn2id2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscribe@beam.apache.org
For additional commands, e-mail: builds-help@beam.apache.org
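[Editor's note] Both failed runs above end in `java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions` during Java deserialization on the Flink task manager. This message does not necessarily mean the class is missing from the classpath: the JVM reports it whenever a class whose static initialization previously failed is touched again. The first access surfaces the real cause as an `ExceptionInInitializerError`; every later access yields only `NoClassDefFoundError`, which is why the trace here hides the original failure. A minimal, self-contained sketch of that JVM behavior (hypothetical class names, unrelated to Beam's actual classpath):

```java
// Demonstrates JVM class-initialization failure semantics (JLS 12.4.2):
// a static initializer that throws puts the class in an erroneous state.
public class InitFailureDemo {
    static class Broken {
        // Initialized via a method call so it is NOT a compile-time
        // constant; accessing VALUE therefore triggers class init.
        static final int VALUE = fail();
        static int fail() { throw new RuntimeException("simulated init failure"); }
    }

    // Returns "ok:<value>" on success, or the simple name of the
    // error thrown while trying to initialize/use the class.
    public static String accessBroken() {
        try {
            return "ok:" + Broken.VALUE;
        } catch (Throwable t) {
            return t.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        // First access: the real cause, wrapped in ExceptionInInitializerError.
        System.out.println(accessBroken());
        // Second access: only "NoClassDefFoundError: Could not initialize class ...",
        // with the original cause no longer attached.
        System.out.println(accessBroken());
    }
}
```

In a trace like the one above, the actionable step is usually to find the first occurrence of `ExceptionInInitializerError` (or an earlier task's log) rather than the repeated `NoClassDefFoundError`.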
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #519
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/519/display/redirect?page=changes>
Changes:
[zyichi] Move master readme.md to 2.40.0
[noreply] [BEAM-14173] Fix Go Loadtests on Dataflow & partial fix for Flink
[noreply] Upgrade python sdk container requirements. (#17549)
[noreply] Merge pull request #17497: [BEAM-11205] Update GCP Libraries BOM version
[noreply] [BEAM-12603] Add retry on grpc data channel and remove retry from test.
[noreply] Merge pull request #17359: [BEAM-14303] Add a way to exclude output
[noreply] [BEAM-14347] Allow users to optimize DoFn execution with a single
[noreply] [BEAM-5878] Add (failing) kwonly-argument test (#17509)
------------------------------------------
[...truncated 34.39 KB...]
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/06 08:43:23 Using specified **** binary: 'linux_amd64/combine'
2022/05/06 08:43:24 Prepared job with id: load-tests-go-flink-batch-combine-1-0506065319_94f244d7-2cbb-4dc1-97a1-f31af12ca1d2 and staging token: load-tests-go-flink-batch-combine-1-0506065319_94f244d7-2cbb-4dc1-97a1-f31af12ca1d2
2022/05/06 08:43:27 Staged binary artifact with token:
2022/05/06 08:43:28 Submitted job: load0tests0go0flink0batch0combine0100506065319-root-0506084327-cf6febed_feeda6b8-845b-41fe-91ad-7450c2de881d
2022/05/06 08:43:28 Job state: STOPPED
2022/05/06 08:43:28 Job state: STARTING
2022/05/06 08:43:28 Job state: RUNNING
2022/05/06 08:43:50 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: d36e2b15a43bc67d6775ee6ea8f3faa5)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
at sun.reflect.GeneratedMethodAccessor25.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:517)
at akka.actor.Actor.aroundReceive$(Actor.scala:515)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
at java.security.AccessController.doPrivileged(Native Method)
at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
at org.apache.flink.runtime.operators.BatchTask.getOutputCollector(BatchTask.java:1362)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1477)
at org.apache.flink.runtime.operators.BatchTask.initOutputs(BatchTask.java:1132)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:245)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
at java.lang.Thread.run(Thread.java:750)
2022/05/06 08:43:50 (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
2022/05/06 08:43:50 Job state: FAILED
2022/05/06 08:43:50 Failed to execute job: job load0tests0go0flink0batch0combine0100506065319-root-0506084327-cf6febed_feeda6b8-845b-41fe-91ad-7450c2de881d failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100506065319-root-0506084327-cf6febed_feeda6b8-845b-41fe-91ad-7450c2de881d failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1102ff8, 0xc0001a6000}, {0xfb671e?, 0x181d428?}, {0xc00016de70?, 0x0?, 0x0?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 50s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/2m3maxenmje3o
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
Build failed in Jenkins: beam_LoadTests_Go_Combine_Flink_Batch #518
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/518/display/redirect?page=changes>
Changes:
[Heejong Lee] [BEAM-9245] Unable to pull datatore Entity which contains dict
[bulat.safiullin] [BEAM-14382] [Website] add banner container for with css, images, html
[Jan Lukavský] [BEAM-14196] add test verifying output watermark propagation in bundle
[Jan Lukavský] [BEAM-14196] Fix FlinkRunner mid-bundle output watermark handling
[nielm] [BEAM-14405] Fix NPE when ProjectID is not specified in a template
[bulat.safiullin] [BEAM-14382] change mobile banner img, add padding to banner section
[ahmedabualsaud] fix test decotrator typo
[noreply] Merge pull request #17440 from [BEAM-14329] Enable exponential backoff
[noreply] [BEAM-11104] Fix output forwarding issue for ProcessContinuations
[noreply] re-add testing package to pydoc (#17524)
[Heejong Lee] add test
[noreply] [BEAM-14250] Amended the workaround (#17531)
[noreply] [BEAM-11104] Fix broken split result validation (#17546)
[noreply] Fixed a SQL and screenshots in the Beam SQL blog (#17545)
[noreply] Merge pull request #17417: [BEAM-14388] Address some performance
[noreply] [BEAM-14386] [Flink] Support for scala 2.12 (#17512)
[noreply] [BEAM-14294] Worker changes to support trivial Batched DoFns (#17384)
[zyichi] Moving to 2.40.0-SNAPSHOT on master branch.
[noreply] [BEAM-14048] [CdapIO] Add ConfigWrapper for building CDAP PluginConfigs
------------------------------------------
[...truncated 33.80 KB...]
coders: <
key: "c13"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
>
>
>
coders: <
key: "c14"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c13"
>
>
coders: <
key: "c2"
value: <
spec: <
urn: "beam:coder:global_window:v1"
>
>
>
coders: <
key: "c3"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
>
>
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c4"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:row:v1"
payload: "\n\032\n\013NumElements\032\013:\t\n\003int\032\002\020\004\n\034\n\rInitialSplits\032\013:\t\n\003int\032\002\020\004\n\026\n\007KeySize\032\013:\t\n\003int\032\002\020\004\n\030\n\tValueSize\032\013:\t\n\003int\032\002\020\004\n\031\n\nNumHotKeys\032\013:\t\n\003int\032\002\020\004\n\024\n\016HotKeyFraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
>
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:go:coder:custom:v1"
payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
>
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:coder:bool:v1"
>
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/05/05 08:43:43 Using specified **** binary: 'linux_amd64/combine'
2022/05/05 08:43:43 Prepared job with id: load-tests-go-flink-batch-combine-1-0505065313_2d392be2-ce7a-4dbf-9a7b-876aec829cd0 and staging token: load-tests-go-flink-batch-combine-1-0505065313_2d392be2-ce7a-4dbf-9a7b-876aec829cd0
2022/05/05 08:43:46 Staged binary artifact with token:
2022/05/05 08:43:48 Submitted job: load0tests0go0flink0batch0combine0100505065313-root-0505084346-a04f077c_6c454191-26e8-4823-a667-7ef54d75a36e
2022/05/05 08:43:48 Job state: STOPPED
2022/05/05 08:43:48 Job state: STARTING
2022/05/05 08:43:48 Job state: RUNNING
2022/05/05 08:43:48 (): org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.fnexecution.wire.WireCoders.instantiateRunnerWireCoder(WireCoders.java:94)
at org.apache.beam.runners.fnexecution.wire.WireCoders.instantiateRunnerWireCoder(WireCoders.java:75)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.translateExecutableStage(FlinkBatchPortablePipelineTranslator.java:311)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.translate(FlinkBatchPortablePipelineTranslator.java:272)
at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator.translate(FlinkBatchPortablePipelineTranslator.java:118)
at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:115)
at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:158)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 18 more
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:158)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 30 more
Caused by: org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.UncheckedExecutionException: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2050)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.get(LocalCache.java:3952)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache.getOrLoad(LocalCache.java:3974)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LocalLoadingCache.get(LocalCache.java:4958)
at org.apache.beam.runners.core.construction.RehydratedComponents.getCoder(RehydratedComponents.java:168)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:158)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 42 more
Caused by: java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
at org.apache.beam.sdk.schemas.SchemaTranslation.fieldTypeFromProtoWithoutNullable(SchemaTranslation.java:328)
at org.apache.beam.sdk.schemas.SchemaTranslation.fieldTypeFromProto(SchemaTranslation.java:244)
at org.apache.beam.sdk.schemas.SchemaTranslation.fieldFromProto(SchemaTranslation.java:238)
at org.apache.beam.sdk.schemas.SchemaTranslation.schemaFromProto(SchemaTranslation.java:212)
at org.apache.beam.runners.core.construction.CoderTranslators$8.fromComponents(CoderTranslators.java:169)
at org.apache.beam.runners.core.construction.CoderTranslators$8.fromComponents(CoderTranslators.java:151)
at org.apache.beam.runners.core.construction.CoderTranslation.fromKnownCoder(CoderTranslation.java:170)
at org.apache.beam.runners.core.construction.CoderTranslation.fromProto(CoderTranslation.java:145)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:87)
at org.apache.beam.runners.core.construction.RehydratedComponents$2.load(RehydratedComponents.java:82)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$LoadingValueReference.loadFuture(LocalCache.java:3528)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.loadSync(LocalCache.java:2277)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.lockedGetOrLoad(LocalCache.java:2154)
at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.cache.LocalCache$Segment.get(LocalCache.java:2044)
... 54 more
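The root cause above is `SchemaTranslation.fieldTypeFromProtoWithoutNullable` rejecting the logical type URN `"int"` while rehydrating a coder. As a rough illustration (this is a hypothetical sketch, not Beam's actual implementation: the map contents, method name, and class are invented for this example), a translator that dispatches on a registry of known URNs and throws for anything unregistered would fail in exactly this way when handed a bare `"int"`:

```java
import java.util.Map;

public class LogicalTypeDispatch {
    // Illustrative registry of known logical type URNs (invented entries,
    // not Beam's real table).
    private static final Map<String, String> KNOWN_URNS = Map.of(
        "beam:logical_type:micros_instant:v1", "TIMESTAMP",
        "beam:logical_type:fixed_bytes:v1", "BYTES");

    static String fieldTypeFromUrn(String urn) {
        String type = KNOWN_URNS.get(urn);
        if (type == null) {
            // A bare "int" is not a registered logical type URN, so
            // translation aborts, matching the error in this build log.
            throw new IllegalArgumentException(
                "Encountered unsupported logical type URN: " + urn);
        }
        return type;
    }

    public static void main(String[] args) {
        System.out.println(fieldTypeFromUrn("beam:logical_type:micros_instant:v1"));
        try {
            fieldTypeFromUrn("int");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

Under this reading, the Go SDK appears to have emitted a schema whose field carried `"int"` where the Java runner expected a registered logical type URN, so the mismatch is in cross-SDK schema translation rather than in the load test itself.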
2022/05/05 08:43:48 (): java.lang.IllegalArgumentException: Encountered unsupported logical type URN: int
2022/05/05 08:43:48 Job state: FAILED
2022/05/05 08:43:48 Failed to execute job: job load0tests0go0flink0batch0combine0100505065313-root-0505084346-a04f077c_6c454191-26e8-4823-a667-7ef54d75a36e failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100505065313-root-0505084346-a04f077c_6c454191-26e8-4823-a667-7ef54d75a36e failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf(0x123cb48, 0xc0000480c0, 0x11182df, 0x19, 0xc000651e78, 0x1, 0x1)
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xec
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:80 +0x414
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 25s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ip3wbcpv3cccm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure