Posted to issues@spark.apache.org by "Jeff Zhang (JIRA)" <ji...@apache.org> on 2015/09/08 04:10:45 UTC

[jira] [Commented] (SPARK-10481) SPARK_PREPEND_CLASSES prevents the spark-yarn related jar from being found

    [ https://issues.apache.org/jira/browse/SPARK-10481?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14734134#comment-14734134 ] 

Jeff Zhang commented on SPARK-10481:
------------------------------------

Working on it (trying to throw a more readable exception).
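
A minimal sketch of the kind of guard that could replace the bare {{.head}} call; the {{SparkJarLocator}} object and its method names are invented for illustration here, not the actual patch:

{code}
import java.io.File

// Illustration only: the object and method names below are made up; the real
// change would live in org.apache.spark.deploy.yarn.Client.
object SparkJarLocator {

  // Return the jar a class was loaded from, or None when it was loaded from a
  // plain class directory (which is what SPARK_PREPEND_CLASSES leads to).
  def jarOfClass(cls: Class[_]): Option[String] =
    Option(cls.getProtectionDomain.getCodeSource)
      .map(cs => new File(cs.getLocation.toURI))
      .filter(f => f.isFile && f.getName.endsWith(".jar"))
      .map(_.getAbsolutePath)

  // Fail with an actionable message instead of "head of empty list".
  def requireSparkJar(cls: Class[_]): String =
    jarOfClass(cls).getOrElse {
      throw new IllegalStateException(
        s"Could not find a jar containing ${cls.getName}. " +
          "If SPARK_PREPEND_CLASSES is set, Spark classes are loaded from build " +
          "output directories instead of the assembly jar; unset it, or set " +
          "spark.yarn.jar to point at the assembly jar explicitly.")
    }
}
{code}

With a guard along these lines, the failure under SPARK_PREPEND_CLASSES would name the environment variable and the {{spark.yarn.jar}} workaround instead of surfacing a bare NoSuchElementException.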

> SPARK_PREPEND_CLASSES prevents the spark-yarn related jar from being found
> --------------------------------------------------------------------
>
>                 Key: SPARK-10481
>                 URL: https://issues.apache.org/jira/browse/SPARK-10481
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>    Affects Versions: 1.4.1
>            Reporter: Jeff Zhang
>
> If SPARK_PREPEND_CLASSES is set, the spark-yarn related jar cannot be found, because org.apache.spark.deploy.yarn.Client is loaded as separate class files rather than from a jar (a small diagnostic sketch follows the stack trace below).
> {code}
> 15/09/08 08:57:10 ERROR SparkContext: Error initializing SparkContext.
> java.util.NoSuchElementException: head of empty list
> 	at scala.collection.immutable.Nil$.head(List.scala:337)
> 	at scala.collection.immutable.Nil$.head(List.scala:334)
> 	at org.apache.spark.deploy.yarn.Client$.org$apache$spark$deploy$yarn$Client$$sparkJar(Client.scala:1048)
> 	at org.apache.spark.deploy.yarn.Client$.populateClasspath(Client.scala:1159)
> 	at org.apache.spark.deploy.yarn.Client.setupLaunchEnv(Client.scala:534)
> 	at org.apache.spark.deploy.yarn.Client.createContainerLaunchContext(Client.scala:645)
> 	at org.apache.spark.deploy.yarn.Client.submitApplication(Client.scala:119)
> 	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:56)
> 	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:144)
> 	at org.apache.spark.SparkContext.<init>(SparkContext.scala:514)
> 	at com.zjffdu.tutorial.spark.WordCount$.main(WordCount.scala:24)
> 	at com.zjffdu.tutorial.spark.WordCount.main(WordCount.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:680)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> {code}
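
A quick diagnostic sketch of the root cause (the object name is made up; it only assumes the spark-yarn classes are on the classpath): with SPARK_PREPEND_CLASSES set, the Spark classes resolve to a build output directory rather than a jar, so any "find the jar of this class" lookup comes back empty.

{code}
import java.io.File

// Prints where the YARN Client class is loaded from. Under
// SPARK_PREPEND_CLASSES this is typically a module's build output
// directory (e.g. .../classes) rather than the assembly jar.
object WhereIsSparkLoadedFrom {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName("org.apache.spark.deploy.yarn.Client")
    Option(cls.getProtectionDomain.getCodeSource)
      .map(cs => new File(cs.getLocation.toURI)) match {
      case Some(f) if f.isFile => println(s"Loaded from jar: $f")
      case Some(f)             => println(s"Loaded from directory (no jar to report): $f")
      case None                => println(s"No code source available for ${cls.getName}")
    }
  }
}
{code}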


