Posted to user@spark.apache.org by Stephen Boesch <ja...@gmail.com> on 2015/08/06 14:18:21 UTC

Spark-submit not finding the main class; the error reflects a different path to the jar file than specified

Given the following command line to spark-submit:

bin/spark-submit --verbose --master local[2]--class
org.yardstick.spark.SparkCoreRDDBenchmark
/shared/ysgood/target/yardstick-spark-uber-0.0.1.jar

Here is the output:

NOTE: SPARK_PREPEND_CLASSES is set, placing locally compiled Spark classes
ahead of assembly.
Using properties file: /shared/spark-1.4.1/conf/spark-defaults.conf
Adding default property: spark.akka.askTimeout=180
Adding default property: spark.master=spark://mellyrn.local:7077
Error: Cannot load main class from JAR
file:/shared/spark-1.4.1/org.yardstick.spark.SparkCoreRDDBenchmark
Run with --help for usage help or --verbose for debug output


The path
"file:/shared/spark-1.4.1/org.yardstick.spark.SparkCoreRDDBenchmark" does
not seem to make sense. It does not reflect the path to the file that was
specified on the spark-submit command line.

Note: when attempting to run that jar file via

    java -classpath shared/ysgood/target/yardstick-spark-uber-0.0.1.jar
org.yardstick.spark.SparkCoreRDDBenchmark

the result is as expected: the main class starts to load and then
there is a NoClassDefFoundError on SparkConf.class (which is not
inside the jar). This shows the app jar is healthy.
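Incidentally, the command as pasted has no space between "local[2]" and
"--class" (possibly just a line-wrap artifact of the archive). If it was
actually run that way, the shell would hand spark-submit the single fused
token "local[2]--class", so the class name would become the first positional
argument, which spark-submit treats as the application jar. That would match
the reported error path exactly. A demonstration of the tokenization (not
from the thread):

```shell
# Demonstration only: show how the shell splits the pasted arguments when
# the space before --class is missing. "--master" pairs with the fused
# token "local[2]--class", leaving the class name as the first positional
# argument (which spark-submit interprets as the application jar).
printf '<%s>\n' --master local[2]--class org.yardstick.spark.SparkCoreRDDBenchmark
```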

Re: Spark-submit not finding the main class; the error reflects a different path to the jar file than specified

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Are you setting SPARK_PREPEND_CLASSES? Try disabling it. With it set, your
uber jar, which does not contain SparkConf, ends up at the front of the
classpath, and that is what is messing things up.
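A minimal sketch of acting on this suggestion in the submitting shell (the
re-run shown in the comment simply echoes the command from the thread, with
a space restored between the master URL and --class):

```shell
# Clear the dev-mode flag so locally compiled Spark classes are no longer
# prepended to the classpath for this shell session.
unset SPARK_PREPEND_CLASSES

# ...then re-run the submit from the thread, e.g.:
# bin/spark-submit --verbose --master local[2] \
#   --class org.yardstick.spark.SparkCoreRDDBenchmark \
#   /shared/ysgood/target/yardstick-spark-uber-0.0.1.jar
```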

Thanks
Best Regards

On Thu, Aug 6, 2015 at 5:48 PM, Stephen Boesch <ja...@gmail.com> wrote:
