Posted to commits@beam.apache.org by "Aviem Zur (JIRA)" <ji...@apache.org> on 2017/04/23 04:27:04 UTC

[jira] [Resolved] (BEAM-2027) get error sometimes while running the same code using beam0.6

     [ https://issues.apache.org/jira/browse/BEAM-2027?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aviem Zur resolved BEAM-2027.
-----------------------------
       Resolution: Not A Bug
    Fix Version/s: Not applicable

No problem!
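For anyone who lands here with the same stack trace: a java.io.InvalidClassException reporting two different serialVersionUID values generally means the driver serialized a class (here the anonymous class org.apache.beam.runners.spark.coders.CoderHelpers$3) from one build of beam-runners-spark while the executors deserialized it against a different build, typically because a stale Beam jar is still on the cluster classpath or the job was submitted with mixed versions. That is a deployment/classpath mismatch rather than a Beam defect, which fits the Not A Bug resolution. A minimal, non-Beam Java sketch of the underlying mechanism (MyFn is a hypothetical class used only for illustration):

import java.io.Serializable;

// Hypothetical class, not part of Beam; it only illustrates how the JVM
// decides whether two copies of "the same" class are compatible.
public class MyFn implements Serializable {
    // With this field present, the UID is pinned and survives recompilation.
    // If it were omitted, the JVM would compute a UID from the class bytes,
    // so two jars built from slightly different sources would usually
    // disagree, and deserializing bytes written by one against the other
    // would throw java.io.InvalidClassException -- exactly what the log
    // below shows for CoderHelpers$3, whose UID in the serialized stream
    // differs from the UID of the class loaded on the executor.
    private static final long serialVersionUID = 1L;

    private final String value;

    public MyFn(String value) {
        this.value = value;
    }
}

The practical fix is operational rather than code-level: rebuild the application against a single Beam version and make sure the exact same Beam jars are shipped to (and not shadowed on) the executors, for example by bundling one shaded application jar.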

> get error sometimes while running the same code using beam0.6
> -------------------------------------------------------------
>
>                 Key: BEAM-2027
>                 URL: https://issues.apache.org/jira/browse/BEAM-2027
>             Project: Beam
>          Issue Type: Bug
>          Components: runner-core, runner-spark
>         Environment: spark-1.6.2-bin-hadoop2.6, hadoop-2.6.0, source:hdfs sink:hdfs
>            Reporter: liyuntian
>            Assignee: Aviem Zur
>             Fix For: Not applicable
>
>
> I run a YARN job using Beam 0.6.0 that reads files from HDFS and writes records back to HDFS, with spark-1.6.2-bin-hadoop2.6 and hadoop-2.6.0. I sometimes get the error below when running the same code (a minimal sketch of this kind of pipeline is included after the log output):
> 17/04/20 21:10:45 WARN scheduler.TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, etl-develop-003): java.io.InvalidClassException: org.apache.beam.runners.spark.coders.CoderHelpers$3; local class incompatible: stream classdesc serialVersionUID = 1334222146820528045, local class serialVersionUID = 5119956493581628999
> 	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
> 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
> 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
> 	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
> 	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:227)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
> 	at java.lang.Thread.run(Thread.java:745)
> 17/04/20 21:10:45 INFO scheduler.TaskSetManager: Starting task 0.1 in stage 0.0 (TID 2, etl-develop-003, partition 0,PROCESS_LOCAL, 2080 bytes)
> 17/04/20 21:10:45 INFO scheduler.TaskSetManager: Lost task 0.1 in stage 0.0 (TID 2) on executor etl-develop-003: java.io.InvalidClassException (org.apache.beam.runners.spark.coders.CoderHelpers$3; local class incompatible: stream classdesc serialVersionUID = 1334222146820528045, local class serialVersionUID = 5119956493581628999) [duplicate 1]
> 17/04/20 21:10:45 INFO scheduler.TaskSetManager: Starting task 0.2 in stage 0.0 (TID 3, etl-develop-003, partition 0,PROCESS_LOCAL, 2080 bytes)
> 17/04/20 21:10:45 INFO scheduler.TaskSetManager: Lost task 0.2 in stage 0.0 (TID 3) on executor etl-develop-003: java.io.InvalidClassException (org.apache.beam.runners.spark.coders.CoderHelpers$3; local class incompatible: stream classdesc serialVersionUID = 1334222146820528045, local class serialVersionUID = 5119956493581628999) [duplicate 2]
> 17/04/20 21:10:45 INFO scheduler.TaskSetManager: Starting task 0.3 in stage 0.0 (TID 4, etl-develop-003, partition 0,PROCESS_LOCAL, 2080 bytes)
> 17/04/20 21:10:45 INFO scheduler.TaskSetManager: Lost task 0.3 in stage 0.0 (TID 4) on executor etl-develop-003: java.io.InvalidClassException (org.apache.beam.runners.spark.coders.CoderHelpers$3; local class incompatible: stream classdesc serialVersionUID = 1334222146820528045, local class serialVersionUID = 5119956493581628999) [duplicate 3]
> 17/04/20 21:10:45 ERROR scheduler.TaskSetManager: Task 0 in stage 0.0 failed 4 times; aborting job
> 17/04/20 21:10:45 INFO cluster.YarnClusterScheduler: Cancelling stage 0
> 17/04/20 21:10:45 INFO cluster.YarnClusterScheduler: Stage 0 was cancelled
> 17/04/20 21:10:45 INFO scheduler.DAGScheduler: ResultStage 0 (collect at BoundedDataset.java:88) failed in 1.260 s
> 17/04/20 21:10:45 INFO scheduler.DAGScheduler: Job 0 failed: collect at BoundedDataset.java:88, took 1.424075 s
> 17/04/20 21:10:46 ERROR yarn.ApplicationMaster: User class threw exception: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.InvalidClassException: org.apache.beam.runners.spark.coders.CoderHelpers$3; local class incompatible: stream classdesc serialVersionUID = 1334222146820528045, local class serialVersionUID = 5119956493581628999
> org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.io.InvalidClassException: org.apache.beam.runners.spark.coders.CoderHelpers$3; local class incompatible: stream classdesc serialVersionUID = 1334222146820528045, local class serialVersionUID = 5119956493581628999
> 	at org.apache.beam.runners.spark.SparkPipelineResult.beamExceptionFrom(SparkPipelineResult.java:73)
> 	at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:113)
> 	at org.apache.beam.runners.spark.SparkPipelineResult.waitUntilFinish(SparkPipelineResult.java:102)
> 	at com.chinamobile.cmss.etl.beam.transform.server.SparkMain.main(SparkMain.java:96)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:558)
> Caused by: java.io.InvalidClassException: org.apache.beam.runners.spark.coders.CoderHelpers$3; local class incompatible: stream classdesc serialVersionUID = 1334222146820528045, local class serialVersionUID = 5119956493581628999
> 	at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:617)
> 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1622)
> 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
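
For context, here is a minimal sketch of the kind of pipeline described in the report (read text from HDFS, write it back to HDFS, executed through the SparkRunner). It is written against the stable Beam Java API rather than the 0.6.0 API the report used, it assumes the Hadoop file system module is on the classpath so hdfs:// paths resolve, and the class name, options, and paths are illustrative only, not taken from the reporter's job:

import org.apache.beam.runners.spark.SparkPipelineOptions;
import org.apache.beam.runners.spark.SparkRunner;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.TextIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class SparkMain {
  public static void main(String[] args) {
    // Pick up pipeline options (--runner=SparkRunner etc.) from the command
    // line; the runner is also set explicitly here for clarity.
    SparkPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(SparkPipelineOptions.class);
    options.setRunner(SparkRunner.class);

    Pipeline p = Pipeline.create(options);

    // Read records from HDFS and write them back to HDFS; the hdfs:// paths
    // are placeholders.
    p.apply("ReadFromHdfs", TextIO.read().from("hdfs://namenode:8020/input/*"))
     .apply("WriteToHdfs", TextIO.write().to("hdfs://namenode:8020/output/part"));

    // waitUntilFinish() is where the PipelineExecutionException in the log
    // above surfaces on the driver (SparkPipelineResult.waitUntilFinish).
    p.run().waitUntilFinish();
  }
}

When submitted to YARN, the InvalidClassException above is thrown while the executors deserialize the task, before any user code runs; intermittent occurrences usually point to some nodes localizing a different Beam jar than others.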



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)