Posted to issues@beam.apache.org by "Kenneth Knowles (Jira)" <ji...@apache.org> on 2022/04/14 18:03:00 UTC

[jira] [Updated] (BEAM-14174) Flink Tests failure : java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions

     [ https://issues.apache.org/jira/browse/BEAM-14174?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Kenneth Knowles updated BEAM-14174:
-----------------------------------
    Priority: P1  (was: P2)

> Flink Tests failure: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
> --------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: BEAM-14174
>                 URL: https://issues.apache.org/jira/browse/BEAM-14174
>             Project: Beam
>          Issue Type: Bug
>          Components: test-failures
>            Reporter: Andoni Guzman
>            Priority: P1
>              Labels: currently-failing
>
> These Flink load tests are affected by this error:
> - beam_LoadTests_Go_ParDo_Flink_Batch (FAILED)
> - beam_LoadTests_Go_SideInput_Flink_Batch (FAILED)
> - beam_LoadTests_Python_Combine_Flink_Batch (FAILED)
> - beam_LoadTests_Python_Combine_Flink_Streaming (FAILED)
> - beam_LoadTests_Python_ParDo_Flink_Batch (FAILED)
> - beam_LoadTests_Python_ParDo_Flink_Streaming (FAILED)
>  
> Stack trace of the error:
> 14:03:35 root_transform_ids: "e11"
> 14:03:35 root_transform_ids: "e12"
> 14:03:35 root_transform_ids: "e13"
> 14:03:35 root_transform_ids: "e14"
> 14:03:35 requirements: "beam:requirement:org.apache.beam:pardo:splittable_dofn:v1"
> 14:03:35 2022/03/07 20:03:33 Prepared job with id: load-tests-go-flink-batch-pardo-1-0307182650_12830e27-009a-44e5-8390-a53ce5f264f0 and staging token: load-tests-go-flink-batch-pardo-1-0307182650_12830e27-009a-44e5-8390-a53ce5f264f0
> 14:03:35 2022/03/07 20:03:33 Using specified worker binary: 'linux_amd64/pardo'
> 14:03:46 2022/03/07 20:03:46 Staged binary artifact with token: 
> 14:03:47 2022/03/07 20:03:47 Submitted job: load0tests0go0flink0batch0pardo0100307182650-root-0307200346-d0077c8_166b7cfd-0805-481a-9a6a-52ba9bfddcbc
> 14:03:47 2022/03/07 20:03:47 Job state: STOPPED
> 14:03:47 2022/03/07 20:03:47 Job state: STARTING
> 14:03:47 2022/03/07 20:03:47 Job state: RUNNING
> 14:04:04 2022/03/07 20:04:04  (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: a8fc45aac0e16ee159027398311e082e)
> 14:04:04 	at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:125)
> 14:04:04 	at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
> 14:04:04 	at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
> 14:04:04 	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
> 14:04:04 	at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
> 14:04:04 	at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$22(RestClusterClient.java:665)
> 14:04:04 	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
> 14:04:04 	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
> 14:04:04 	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
> 14:04:04 	at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
> 14:04:04 	at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:394)
> 14:04:04 	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
> 14:04:04 	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
> 14:04:04 	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
> 14:04:04 	at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
> 14:04:04 	at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
> 14:04:04 	at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
> 14:04:04 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
> 14:04:04 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
> 14:04:04 	at java.lang.Thread.run(Thread.java:748)
> 14:04:04 Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
> 14:04:04 	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
> 14:04:04 	at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:123)
> 14:04:04 	... 19 more
> 14:04:04 Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
> 14:04:04 	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:118)
> 14:04:04 	at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:80)
> 14:04:04 	at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:233)
> 14:04:04 	at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:224)
> 14:04:04 	at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:215)
> 14:04:04 	at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:666)
> 14:04:04 	at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:89)
> 14:04:04 	at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:446)
> 14:04:04 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 14:04:04 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 14:04:04 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 14:04:04 	at java.lang.reflect.Method.invoke(Method.java:498)
> 14:04:04 	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:305)
> 14:04:04 	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:212)
> 14:04:04 	at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:77)
> 14:04:04 	at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:158)
> 14:04:04 	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
> 14:04:04 	at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
> 14:04:04 	at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
> 14:04:04 	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
> 14:04:04 	at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
> 14:04:04 	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
> 14:04:04 	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
> 14:04:04 	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
> 14:04:04 	at akka.actor.Actor.aroundReceive(Actor.scala:517)
> 14:04:04 	at akka.actor.Actor.aroundReceive$(Actor.scala:515)
> 14:04:04 	at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
> 14:04:04 	at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
> 14:04:04 	at akka.actor.ActorCell.invoke(ActorCell.scala:561)
> 14:04:04 	at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
> 14:04:04 	at akka.dispatch.Mailbox.run(Mailbox.scala:225)
> 14:04:04 	at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
> 14:04:04 	at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 14:04:04 	at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 14:04:04 	at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 14:04:04 	at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> 14:04:04 Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
> 14:04:04 	at java.io.ObjectStreamClass.hasStaticInitializer(Native Method)
> 14:04:04 	at java.io.ObjectStreamClass.computeDefaultSUID(ObjectStreamClass.java:1955)
> 14:04:04 	at java.io.ObjectStreamClass.access$100(ObjectStreamClass.java:79)
> 14:04:04 	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:275)
> 14:04:04 	at java.io.ObjectStreamClass$2.run(ObjectStreamClass.java:273)
> 14:04:04 	at java.security.AccessController.doPrivileged(Native Method)
> 14:04:04 	at java.io.ObjectStreamClass.getSerialVersionUID(ObjectStreamClass.java:272)
> 14:04:04 	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:694)
> 14:04:04 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:2005)
> 14:04:04 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1852)
> 14:04:04 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2186)
> 14:04:04 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
> 14:04:04 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2431)
> 14:04:04 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2355)
> 14:04:04 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2213)
> 14:04:04 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1669)
> 14:04:04 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:503)
> 14:04:04 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:461)
> 14:04:04 	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:615)
> 14:04:04 	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:600)
> 14:04:04 	at org.apache.flink.util.InstantiationUtil.deserializeObject(InstantiationUtil.java:587)
> 14:04:04 	at org.apache.flink.util.InstantiationUtil.readObjectFromConfig(InstantiationUtil.java:541)
> 14:04:04 	at org.apache.flink.api.java.typeutils.runtime.RuntimeSerializerFactory.readParametersFromConfig(RuntimeSerializerFactory.java:78)
> 14:04:04 	at org.apache.flink.runtime.operators.util.TaskConfig.getTypeSerializerFactory(TaskConfig.java:1246)
> 14:04:04 	at org.apache.flink.runtime.operators.util.TaskConfig.getOutputSerializer(TaskConfig.java:599)
> 14:04:04 	at org.apache.flink.runtime.operators.DataSourceTask.initInputFormat(DataSourceTask.java:324)
> 14:04:04 	at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:106)
> 14:04:04 	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:758)
> 14:04:04 	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:573)
> 14:04:04 	at java.lang.Thread.run(Thread.java:750)
> 14:04:04 2022/03/07 20:04:04  (): java.lang.NoClassDefFoundError: Could not initialize class org.apache.beam.runners.core.construction.SerializablePipelineOptions
> 14:04:05 2022/03/07 20:04:05 Job state: FAILED
> 14:04:05 2022/03/07 20:04:05 Failed to execute job: job load0tests0go0flink0batch0pardo0100307182650-root-0307200346-d0077c8_166b7cfd-0805-481a-9a6a-52ba9bfddcbc failed
> 14:04:05 panic: Failed to execute job: job load0tests0go0flink0batch0pardo0100307182650-root-0307200346-d0077c8_166b7cfd-0805-481a-9a6a-52ba9bfddcbc failed
> 14:04:05 
> 14:04:05 goroutine 1 [running]:
> 14:04:05 github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf(0x1143648, 0xc000120000, 0x10323eb, 0x19, 0xc0003cde00, 0x1, 0x1)
> 14:04:05 	/var/jenkins_real_home/workspace/beam_LoadTests_Go_ParDo_Flink_Batch/src/sdks/go/pkg/beam/log/log.go:153 +0xec
> 14:04:05 main.main()
> 14:04:05 	/var/jenkins_real_home/workspace/beam_LoadTests_Go_ParDo_Flink_Batch/src/sdks/go/test/load/pardo/pardo.go:105 +0x3ca
> 14:04:05 
> 14:04:05 > Task :sdks:go:test:load:run FAILED
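> 
> Note (not from the logs above, just context for reading them): the JVM reports "NoClassDefFoundError: Could not initialize class X" when X's static initializer has already failed once in that JVM. The first failure surfaces as an ExceptionInInitializerError (likely earlier in the TaskManager logs), and every later reference to the class, including the deserialization path shown above, gets this shorter message. A minimal Java sketch of that JVM behavior, with made-up class names:
> 
>     public class InitFailureDemo {
>         // Hypothetical class whose static initialization always fails.
>         static class Broken {
>             static final int VALUE = compute();
>             static int compute() { throw new RuntimeException("static init failed"); }
>         }
> 
>         public static void main(String[] args) {
>             try {
>                 new Broken();             // first use: java.lang.ExceptionInInitializerError
>             } catch (Throwable t) {
>                 System.out.println("first use:  " + t);
>             }
>             try {
>                 new Broken();             // later uses: java.lang.NoClassDefFoundError:
>             } catch (Throwable t) {       // "Could not initialize class ...Broken"
>                 System.out.println("second use: " + t);
>             }
>         }
>     }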



--
This message was sent by Atlassian Jira
(v8.20.1#820001)