Posted to issues@spark.apache.org by "Yesha Vora (JIRA)" <ji...@apache.org> on 2016/07/06 06:53:11 UTC

[jira] [Resolved] (SPARK-15847) DecisionTreeRunner example gets stuck with "NoClassDefFoundError: org/apache/avro/generic/GenericRecord"

     [ https://issues.apache.org/jira/browse/SPARK-15847?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yesha Vora resolved SPARK-15847.
--------------------------------
    Resolution: Cannot Reproduce

> DecisionTreeRunner example gets stuck with "NoClassDefFoundError: org/apache/avro/generic/GenericRecord"
> --------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-15847
>                 URL: https://issues.apache.org/jira/browse/SPARK-15847
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.0.0
>            Reporter: Yesha Vora
>
> In Spark 2, the DecisionTreeRunner example hangs with "NoClassDefFoundError: org/apache/avro/generic/GenericRecord".
> The same application passes in yarn-cluster mode; I hit this issue only in yarn-client mode.
> {code}
> spark-submit  --class org.apache.spark.examples.mllib.DecisionTreeRunner --master yarn-client --jars hadoop-lzo-*.jar /xxx/lib/spark-examples_*jar  /tmp/sparkMLLInput/sample_libsvm_data.txt{code}
> {code}
> 16/05/27 02:37:50 INFO SparkContext: Starting job: countByValue at DecisionTreeRunner.scala:185
> Exception in thread "dag-scheduler-event-loop" java.lang.NoClassDefFoundError: org/apache/avro/generic/GenericRecord
> at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:112)
> at org.apache.spark.serializer.KryoSerializerInstance.borrowKryo(KryoSerializer.scala:274)
> at org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:259)
> at org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:175)
> at org.apache.spark.serializer.KryoSerializer.supportsRelocationOfSerializedObjects$lzycompute(KryoSerializer.scala:182)
> at org.apache.spark.serializer.KryoSerializer.supportsRelocationOfSerializedObjects(KryoSerializer.scala:178)
> at org.apache.spark.shuffle.sort.SortShuffleManager$.canUseSerializedShuffle(SortShuffleManager.scala:187)
> at org.apache.spark.shuffle.sort.SortShuffleManager.registerShuffle(SortShuffleManager.scala:99)
> at org.apache.spark.ShuffleDependency.<init>(Dependency.scala:90)
> at org.apache.spark.rdd.ShuffledRDD.getDependencies(ShuffledRDD.scala:91)
> at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:234)
> at org.apache.spark.rdd.RDD$$anonfun$dependencies$2.apply(RDD.scala:232)
> at scala.Option.getOrElse(Option.scala:121)
> at org.apache.spark.rdd.RDD.dependencies(RDD.scala:232)
> at org.apache.spark.scheduler.DAGScheduler.visit$1(DAGScheduler.scala:391)
> at org.apache.spark.scheduler.DAGScheduler.getParentStages(DAGScheduler.scala:403)
> at org.apache.spark.scheduler.DAGScheduler.getParentStagesAndId(DAGScheduler.scala:304)
> at org.apache.spark.scheduler.DAGScheduler.newResultStage(DAGScheduler.scala:339)
> at org.apache.spark.scheduler.DAGScheduler.handleJobSubmitted(DAGScheduler.scala:849)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1626)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1618)
> at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1607)
> at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
> Caused by: java.lang.ClassNotFoundException: org.apache.avro.generic.GenericRecord
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> ... 23 more{code}
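
Editorial note: the trace fails inside KryoSerializer.newKryo, which references Avro's GenericRecord, so the error points to the Avro classes being absent from the driver classpath in yarn-client mode (in yarn-cluster mode the driver runs inside a YARN container, where the Hadoop-provided Avro jar is typically already on the classpath, which would explain why that mode passes). A possible workaround, not confirmed on this ticket, is to put the Avro jar on the driver and executor classpaths explicitly; the jar path below is a hypothetical placeholder to be adjusted for the local distribution:

{code}
# spark-defaults.conf — sketch of a workaround; /path/to/avro-x.y.z.jar is a
# placeholder for wherever the distribution ships the Avro jar
spark.driver.extraClassPath   /path/to/avro-x.y.z.jar
spark.executor.extraClassPath /path/to/avro-x.y.z.jar
{code}

Equivalently, the jar could be passed on the command line alongside the existing hadoop-lzo jar via --jars, which in yarn-client mode also adds it to the driver's classpath.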



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org