Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2014/12/31 21:02:13 UTC

[jira] [Resolved] (SPARK-4298) spark-submit cannot read Main-Class from the manifest.

     [ https://issues.apache.org/jira/browse/SPARK-4298?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen resolved SPARK-4298.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 1.2.1
                   1.1.2
                   1.3.0
                   1.0.3
         Assignee: Brennon York

This was fixed by [~boyork]'s PR, which I've merged into all of the maintenance branches.

> spark-submit cannot read Main-Class from the manifest.
> ------------------------------------------------------
>
>                 Key: SPARK-4298
>                 URL: https://issues.apache.org/jira/browse/SPARK-4298
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.1.0
>         Environment: Linux
> spark-1.1.0-bin-hadoop2.4.tgz
> java version "1.7.0_72"
> Java(TM) SE Runtime Environment (build 1.7.0_72-b14)
> Java HotSpot(TM) 64-Bit Server VM (build 24.72-b04, mixed mode)
>            Reporter: Milan Straka
>            Assignee: Brennon York
>             Fix For: 1.0.3, 1.3.0, 1.1.2, 1.2.1
>
>
> Consider trivial {{test.scala}}:
> {code:title=test.scala|borderStyle=solid}
> import org.apache.spark.SparkContext
> import org.apache.spark.SparkContext._
> object Main {
>   def main(args: Array[String]) {
>     val sc = new SparkContext()
>     sc.stop()
>   }
> }
> {code}
> When the jar is built with {{sbt}} and submitted using {{spark-submit target/scala-2.10/test_2.10-1.0.jar}}, I get the following error:
> {code}
> Spark assembly has been built with Hive, including Datanucleus jars on classpath
> Error: Cannot load main class from JAR: file:/ha/home/straka/s/target/scala-2.10/test_2.10-1.0.jar
> Run with --help for usage help or --verbose for debug output
> {code}
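> For reference, a minimal {{build.sbt}} along these lines reproduces the setup; the project name, Scala version and the explicit {{mainClass}} setting here are illustrative, not copied verbatim from my project:
> {code:title=build.sbt|borderStyle=solid}
> name := "test"
> 
> version := "1.0"
> 
> scalaVersion := "2.10.4"
> 
> // "provided" because spark-submit supplies Spark on the classpath at run time
> libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"
> 
> // sbt writes Main-Class into MANIFEST.MF on its own when there is a single main
> // class; setting it explicitly just documents the expectation
> mainClass in (Compile, packageBin) := Some("Main")
> {code}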
> When executed using {{spark-submit --class Main target/scala-2.10/test_2.10-1.0.jar}}, it works.
> The jar file has a correct MANIFEST.MF:
> {code:title=MANIFEST.MF|borderStyle=solid}
> Manifest-Version: 1.0
> Implementation-Vendor: test
> Implementation-Title: test
> Implementation-Version: 1.0
> Implementation-Vendor-Id: test
> Specification-Vendor: test
> Specification-Title: test
> Specification-Version: 1.0
> Main-Class: Main
> {code}
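> The attribute is readable as soon as {{JarFile}} is given a plain filesystem path, so the manifest itself is not the problem; a quick check along these lines (the path is illustrative) confirms it:
> {code:title=CheckManifest.scala|borderStyle=solid}
> import java.util.jar.JarFile
> 
> // Opening the jar by filesystem path works and yields the expected attribute.
> val jar = new JarFile("target/scala-2.10/test_2.10-1.0.jar")
> println(jar.getManifest.getMainAttributes.getValue("Main-Class"))  // prints "Main"
> jar.close()
> {code}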
> The problem is that on line 127 of {{org.apache.spark.deploy.SparkSubmitArguments}}:
> {code}
>   val jar = new JarFile(primaryResource)
> {code}
> the {{primaryResource}} has the String value {{"file:/ha/home/straka/s/target/scala-2.10/test_2.10-1.0.jar"}}, which is a URI, but {{JarFile}} accepts only a filesystem path. One way to fix this would be to use:
> {code}
>   val uri = new URI(primaryResource)
>   val jar = new JarFile(uri.getPath)
> {code}
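> As a self-contained sketch of that suggestion (not necessarily the exact patch that gets merged), stripping the {{file:}} scheme before handing the path to {{JarFile}} could look like this; the helper name is illustrative:
> {code:title=Sketch.scala|borderStyle=solid}
> import java.net.URI
> import java.util.jar.JarFile
> 
> // primaryResource may arrive either as a plain path or as a file: URI;
> // JarFile only accepts a filesystem path, so normalise it first.
> def mainClassFromJar(primaryResource: String): String = {
>   val uri = new URI(primaryResource)
>   val path = if (uri.getScheme == null || uri.getScheme == "file") uri.getPath else primaryResource
>   val jar = new JarFile(path)
>   try jar.getManifest.getMainAttributes.getValue("Main-Class")
>   finally jar.close()
> }
> {code}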



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
