Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2016/03/24 20:43:25 UTC
[jira] [Resolved] (SPARK-14136) Spark 2.0 can't start with yarn mode with ClassNotFoundException: org.apache.spark.deploy.yarn.history.YarnHistoryService
[ https://issues.apache.org/jira/browse/SPARK-14136?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Marcelo Vanzin resolved SPARK-14136.
------------------------------------
Resolution: Not A Problem
{{YarnHistoryService}} is not a Spark class; you have to talk to the people who provide that code (I believe it's part of HDP).
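The stack trace below shows why a vendor-specific entry causes this: Spark's {{SchedulerExtensionServices.start}} resolves each configured service class name reflectively via {{Utils.classForName}}, so a leftover entry naming a class that is not on the classpath fails with {{ClassNotFoundException}}. A minimal sketch (not Spark code; the class and method names here are illustrative, and the HDP-side property that lists the services is assumed, not verified):

```java
// Sketch of a reflective service loader in the style of
// SchedulerExtensionServices: each configured class name is resolved
// with Class.forName, so a stale entry fails before the service starts.
public class ServiceLoaderSketch {
    static Object loadService(String className) throws Exception {
        // Spark's Utils.classForName ultimately delegates to
        // Class.forName, which throws ClassNotFoundException when the
        // class is absent from the classpath.
        return Class.forName(className).getDeclaredConstructor().newInstance();
    }

    public static void main(String[] args) {
        try {
            loadService("org.apache.spark.deploy.yarn.history.YarnHistoryService");
        } catch (ClassNotFoundException e) {
            // The exception message is the missing class name.
            System.out.println("ClassNotFoundException: " + e.getMessage());
        } catch (Exception e) {
            System.out.println("Other failure: " + e);
        }
    }
}
// prints "ClassNotFoundException: org.apache.spark.deploy.yarn.history.YarnHistoryService"
```

In practice, the usual remedy reported for setups like this is to remove the stale service entry from the cluster's Spark configuration (presumably a {{spark.yarn.services}}-style property that HDP sets), or to put the vendor jar that provides the class on the classpath.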
> Spark 2.0 can't start with yarn mode with ClassNotFoundException: org.apache.spark.deploy.yarn.history.YarnHistoryService
> -------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-14136
> URL: https://issues.apache.org/jira/browse/SPARK-14136
> Project: Spark
> Issue Type: Bug
> Components: PySpark, Spark Shell, YARN
> Affects Versions: 2.0.0
> Environment: Hortonworks Hadoop 2.7.1, HDP 2.3.2, Java 1.8.40
> Reporter: Qi Dai
>
> For the recent Spark nightly master builds (I tried the current build and many builds from the last couple of weeks), spark-shell/pyspark can't start in yarn mode; it fails with ClassNotFoundException: org.apache.spark.deploy.yarn.history.YarnHistoryService
> The full stack trace is:
> java.lang.ClassNotFoundException: org.apache.spark.deploy.yarn.history.YarnHistoryService
> at scala.reflect.internal.util.AbstractFileClassLoader.findClass(AbstractFileClassLoader.scala:62)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at java.lang.Class.forName0(Native Method)
> at java.lang.Class.forName(Class.java:348)
> at org.apache.spark.util.Utils$.classForName(Utils.scala:177)
> at org.apache.spark.scheduler.cluster.SchedulerExtensionServices$$anonfun$start$5.apply(SchedulerExtensionService.scala:109)
> at org.apache.spark.scheduler.cluster.SchedulerExtensionServices$$anonfun$start$5.apply(SchedulerExtensionService.scala:108)
> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:245)
> at scala.collection.mutable.ArraySeq.foreach(ArraySeq.scala:74)
> at scala.collection.TraversableLike$class.map(TraversableLike.scala:245)
> at scala.collection.AbstractTraversable.map(Traversable.scala:104)
> at org.apache.spark.scheduler.cluster.SchedulerExtensionServices.start(SchedulerExtensionService.scala:108)
> at org.apache.spark.scheduler.cluster.YarnSchedulerBackend.start(YarnSchedulerBackend.scala:81)
> at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
> at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:501)
> at org.apache.spark.repl.Main$.createSparkContext(Main.scala:89)
> ... 48 elided
> java.lang.NullPointerException
> at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1020)
> at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:91)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
> at org.apache.spark.repl.Main$.createSQLContext(Main.scala:99)
> ... 48 elided
> <console>:13: error: not found: value sqlContext
> import sqlContext.implicits._
> ^
> <console>:13: error: not found: value sqlContext
> import sqlContext.sql
> ^
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org