Posted to user@spark.apache.org by Barrington <ba...@me.com> on 2014/09/17 05:47:13 UTC

YARN mode not available error

Hi,

I am running Spark in cluster mode with Hadoop YARN as the underlying
cluster manager. I get this error when trying to initialize the
SparkContext. 
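
For reference, the SparkContext setup that hits the error looks roughly
like this (a simplified sketch, not the exact file; the master is
yarn-client, which is what makes Spark look up the YARN scheduler class):

import org.apache.spark.{SparkConf, SparkContext}

object LascoScript {
  def main(args: Array[String]) {
    val conf = new SparkConf()
      .setAppName("LascoScript")
      .setMaster("yarn-client") // tells SparkContext to use the YARN client scheduler
    val sc = new SparkContext(conf) // fails here (LascoScript.scala:24 in the trace)
    // ... rest of the job ...
    sc.stop()
  }
}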


Exception in thread "main" org.apache.spark.SparkException: YARN mode not available ?
	at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:1586)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:310)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:86)
	at LascoScript$.main(LascoScript.scala:24)
	at LascoScript.main(LascoScript.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.YarnClientClusterScheduler
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:190)
	at org.apache.spark.SparkContext$.org$apache$spark$SparkContext$$createTaskScheduler(SparkContext.scala:1580)




My build.sbt file looks like this:



name := "LascoScript"

version := "1.0"

scalaVersion := "2.10.4"

val excludeJBossNetty = ExclusionRule(organization = "org.jboss.netty")
val excludeMortbayJetty = ExclusionRule(organization = "org.eclipse.jetty", artifact = "jetty-server")
val excludeAsm = ExclusionRule(organization = "org.ow2.asm")
val excludeCommonsLogging = ExclusionRule(organization = "commons-logging")
val excludeSLF4J = ExclusionRule(organization = "org.slf4j")
val excludeOldAsm = ExclusionRule(organization = "asm")
val excludeServletApi = ExclusionRule(organization = "javax.servlet", artifact = "servlet-api")


libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
excludeAll(
 excludeServletApi, excludeMortbayJetty
)

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.5.1"
excludeAll(
 excludeJBossNetty, excludeMortbayJetty, excludeAsm, excludeCommonsLogging,
excludeSLF4J, excludeOldAsm, excludeServletApi
 )

libraryDependencies += "org.mortbay.jetty" % "servlet-api" % "3.0.20100224"

libraryDependencies += "org.eclipse.jetty" % "jetty-server" %
"8.1.16.v20140903"

unmanagedJars in Compile ++= {
  val base = baseDirectory.value
  val baseDirectories = (base / "lib") +++ base
  val customJars = baseDirectories ** "*.jar"
  customJars.classpath
}

resolvers += "Akka Repository" at "http://repo.akka.io/releases/"



How can I fix this issue?

- Barrington






Re: YARN mode not available error

Posted by Sean Owen <so...@cloudera.com>.
Caused by: java.lang.ClassNotFoundException: org.apache.spark.scheduler.cluster.YarnClientClusterScheduler

It sounds like you perhaps deployed a custom build of Spark that does
not include YARN support. You need -Pyarn in your build.
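
If you are building Spark from source, the 1.1.0 build docs give a
command along these lines (the hadoop-2.4 profile and version flag here
are assumptions based on your hadoop-client 2.5.1 dependency; adjust
per the Building Spark page):

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.5.1 -DskipTests clean package

Also note your trace shows the app launched from IntelliJ
(com.intellij.rt.execution.application.AppMain) rather than via
spark-submit, so the YARN integration classes must be on the
application classpath itself. A sketch of one way to do that in your
build.sbt, assuming the spark-yarn artifact is published for your Spark
version:

libraryDependencies += "org.apache.spark" %% "spark-yarn" % "1.1.0"

That module contains the missing
org.apache.spark.scheduler.cluster.YarnClientClusterScheduler class.
You can check whether a given jar provides it with something like
jar tf <your-spark-assembly>.jar | grep YarnClientClusterScheduler
(jar name is a placeholder).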

On Wed, Sep 17, 2014 at 4:47 AM, Barrington <ba...@me.com> wrote:
> [original message quoted in full; snipped]

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org