Posted to dev@toree.apache.org by "Liam Fisk (JIRA)" <ji...@apache.org> on 2016/06/06 22:02:21 UTC

[jira] [Commented] (TOREE-321) Classes defined in notebook are not available to remote executors

    [ https://issues.apache.org/jira/browse/TOREE-321?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15317355#comment-15317355 ] 

Liam Fisk commented on TOREE-321:
---------------------------------

I've noticed the Spark environment doesn't have spark.repl.class.uri defined, which may be a symptom of this problem.
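
For reference, here is a quick way to confirm that from a notebook cell (a minimal sketch, assuming the Spark 1.x property name spark.repl.class.uri; adjust for your Spark version):

{code}
// Hypothetical check: ask the driver's SparkConf whether the REPL class
// server URI has been registered. If it is missing, executors have no
// location from which to fetch classes compiled in the notebook session.
sc.getConf.getOption("spark.repl.class.uri") match {
  case Some(uri) => println(s"spark.repl.class.uri = $uri")
  case None      => println("spark.repl.class.uri is not set")
}
{code}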

> Classes defined in notebook are not available to remote executors
> -----------------------------------------------------------------
>
>                 Key: TOREE-321
>                 URL: https://issues.apache.org/jira/browse/TOREE-321
>             Project: TOREE
>          Issue Type: Bug
>            Reporter: Liam Fisk
>
> A problem similar to SPARK-6299 is present with the current Toree code.
> Running against "--master local" succeeds with the following code, but running against a Mesos cluster fails:
> {code}
> case class ClassA(value: String)
> val rdd = sc.parallelize(List(("k1", ClassA("v1")), ("k1", ClassA("v2")) ))
> rdd.groupByKey.collect
> {code}
> {code}
> Name: org.apache.spark.SparkException
> Message: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, drake1.orion.internal): java.lang.ClassNotFoundException: $line107.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$anonfun$udfHash$1
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:348)
> 	at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
> 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
> 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:366)
> 	at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1909)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1909)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
> 	at scala.collection.immutable.$colon$colon.readObject(List.scala:362)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1909)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
> 	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
> 	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
> 	at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:61)
> 	at org.apache.spark.scheduler.Task.run(Task.scala:89)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Driver stacktrace:
> StackTrace: org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
> scala.Option.foreach(Option.scala:236)
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
> org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
> org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
> org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
> org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
> org.apache.spark.sql.execution.SparkPlan.executeTake(SparkPlan.scala:212)
> org.apache.spark.sql.execution.Limit.executeCollect(basicOperators.scala:165)
> org.apache.spark.sql.execution.SparkPlan.executeCollectPublic(SparkPlan.scala:174)
> org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
> org.apache.spark.sql.DataFrame$$anonfun$org$apache$spark$sql$DataFrame$$execute$1$1.apply(DataFrame.scala:1499)
> org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:56)
> org.apache.spark.sql.DataFrame.withNewExecutionId(DataFrame.scala:2086)
> org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$execute$1(DataFrame.scala:1498)
> org.apache.spark.sql.DataFrame.org$apache$spark$sql$DataFrame$$collect(DataFrame.scala:1505)
> org.apache.spark.sql.DataFrame$$anonfun$head$1.apply(DataFrame.scala:1375)
> org.apache.spark.sql.DataFrame$$anonfun$head$1.apply(DataFrame.scala:1374)
> org.apache.spark.sql.DataFrame.withCallback(DataFrame.scala:2099)
> org.apache.spark.sql.DataFrame.head(DataFrame.scala:1374)
> org.apache.spark.sql.DataFrame.take(DataFrame.scala:1456)
> org.apache.spark.sql.DataFrame.showString(DataFrame.scala:170)
> org.apache.spark.sql.DataFrame.show(DataFrame.scala:350)
> org.apache.spark.sql.DataFrame.show(DataFrame.scala:311)
> org.apache.spark.sql.DataFrame.show(DataFrame.scala:319)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:74)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:80)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:82)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:84)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:86)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:88)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:90)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:92)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:94)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:96)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:98)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:100)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:102)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:104)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:106)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:108)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:110)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:112)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:114)
> $line108.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:116)
> $line108.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:118)
> $line108.$read$$iwC$$iwC$$iwC.<init>(<console>:120)
> $line108.$read$$iwC$$iwC.<init>(<console>:122)
> $line108.$read$$iwC.<init>(<console>:124)
> $line108.$read.<init>(<console>:126)
> $line108.$read$.<init>(<console>:130)
> $line108.$read$.<clinit>(<console>)
> $line108.$eval$.<init>(<console>:7)
> $line108.$eval$.<clinit>(<console>)
> $line108.$eval.$print(<console>)
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> java.lang.reflect.Method.invoke(Method.java:498)
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:361)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:356)
> org.apache.toree.global.StreamState$.withStreams(StreamState.scala:81)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:355)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:355)
> org.apache.toree.utils.TaskManager$$anonfun$add$2$$anon$1.run(TaskManager.scala:140)
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> java.lang.Thread.run(Thread.java:745)
> In [5]:
> case class ClassA(value: String)
> val rdd = sc.parallelize(List(("k1", ClassA("v1")), ("k1", ClassA("v2")) ))
> rdd.groupByKey.collect
> case class ClassA(value: String)
> val rdd = sc.parallelize(List(("k1", ClassA("v1")), ("k1", ClassA("v2")) ))
> rdd.groupByKey.collect
> Out[5]:
> Name: org.apache.spark.SparkException
> Message: Job aborted due to stage failure: Task 5 in stage 1.0 failed 4 times, most recent failure: Lost task 5.3 in stage 1.0 (TID 27, drake1.orion.internal): java.io.IOException: java.lang.ClassNotFoundException: $line113.$read$$iwC$$iwC$ClassA
> 	at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1207)
> 	at org.apache.spark.rdd.ParallelCollectionPartition.readObject(ParallelCollectionRDD.scala:70)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1909)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
> 	at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
> 	at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
> 	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:194)
> 	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> 	at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.ClassNotFoundException: $line113.$read$$iwC$$iwC$ClassA
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:348)
> 	at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:68)
> 	at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
> 	at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.readArray(ObjectInputStream.java:1714)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1347)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1942)
> 	at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
> 	at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
> 	at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2018)
> 	at java.io.ObjectInputStream.defaultReadObject(ObjectInputStream.java:503)
> 	at org.apache.spark.rdd.ParallelCollectionPartition$$anonfun$readObject$1.apply$mcV$sp(ParallelCollectionRDD.scala:74)
> 	at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1204)
> 	... 20 more
> Driver stacktrace:
> StackTrace: org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1431)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1419)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1418)
> scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
> scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
> org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1418)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
> org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:799)
> scala.Option.foreach(Option.scala:236)
> org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:799)
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1640)
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1599)
> org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1588)
> org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:620)
> org.apache.spark.SparkContext.runJob(SparkContext.scala:1832)
> org.apache.spark.SparkContext.runJob(SparkContext.scala:1845)
> org.apache.spark.SparkContext.runJob(SparkContext.scala:1858)
> org.apache.spark.SparkContext.runJob(SparkContext.scala:1929)
> org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:927)
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
> org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
> org.apache.spark.rdd.RDD.withScope(RDD.scala:316)
> org.apache.spark.rdd.RDD.collect(RDD.scala:926)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:64)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:69)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:71)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:73)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:75)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:77)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:79)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:81)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:83)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:85)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:87)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:89)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:91)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:93)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:95)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:97)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:99)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:101)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:103)
> $line115.$read$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:105)
> $line115.$read$$iwC$$iwC$$iwC$$iwC.<init>(<console>:107)
> $line115.$read$$iwC$$iwC$$iwC.<init>(<console>:109)
> $line115.$read$$iwC$$iwC.<init>(<console>:111)
> $line115.$read$$iwC.<init>(<console>:113)
> $line115.$read.<init>(<console>:115)
> $line115.$read$.<init>(<console>:119)
> $line115.$read$.<clinit>(<console>)
> $line115.$eval$.<init>(<console>:7)
> $line115.$eval$.<clinit>(<console>)
> $line115.$eval.$print(<console>)
> sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> java.lang.reflect.Method.invoke(Method.java:498)
> org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
> org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
> org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
> org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:361)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1$$anonfun$apply$3.apply(ScalaInterpreter.scala:356)
> org.apache.toree.global.StreamState$.withStreams(StreamState.scala:81)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:355)
> org.apache.toree.kernel.interpreter.scala.ScalaInterpreter$$anonfun$interpretAddTask$1.apply(ScalaInterpreter.scala:355)
> org.apache.toree.utils.TaskManager$$anonfun$add$2$$anon$1.run(TaskManager.scala:140)
> java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> java.lang.Thread.run(Thread.java:745)
> {code}


