Posted to issues@spark.apache.org by "Christian Tzolov (JIRA)" <ji...@apache.org> on 2014/09/21 11:53:33 UTC

[jira] [Created] (SPARK-3624) "Failed to find Spark assembly in /usr/share/spark/lib" for RELEASED debian packages

Christian Tzolov created SPARK-3624:
---------------------------------------

             Summary: "Failed to find Spark assembly in /usr/share/spark/lib" for RELEASED debian packages
                 Key: SPARK-3624
                 URL: https://issues.apache.org/jira/browse/SPARK-3624
             Project: Spark
          Issue Type: Bug
          Components: Build, Deploy
    Affects Versions: 1.1.0
            Reporter: Christian Tzolov
            Priority: Minor


compute-classpath.sh requires that, for a RELEASED package, the Spark assembly jar be accessible from the <spark home>/lib folder.

Currently the jdeb packaging (assembly module) bundles the assembly jar into a folder called 'jars'. 

The result is:
/usr/share/spark/bin/spark-submit   --num-executors 10    --master yarn-cluster   --class org.apache.spark.examples.SparkPi   /usr/share/spark/jars/spark-examples-1.1.0-hadoop2.2.0-gphd-3.0.1.0.jar 10
ls: cannot access /usr/share/spark/lib: No such file or directory
Failed to find Spark assembly in /usr/share/spark/lib
You need to build Spark before running this program.

A trivial solution is to change '<prefix>${deb.install.path}/jars</prefix>' inside assembly/pom.xml to '<prefix>${deb.install.path}/lib</prefix>'.
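
For illustration, a minimal sketch of how the changed jdeb mapping in assembly/pom.xml could look (the surrounding <data>/<mapper> elements and the ${spark.jar} property are assumptions about the shape of the existing entry, not quoted from the actual pom):

<data>
  <src>${spark.jar}</src>
  <type>file</type>
  <mapper>
    <type>perm</type>
    <!-- was: <prefix>${deb.install.path}/jars</prefix> -->
    <prefix>${deb.install.path}/lib</prefix>
  </mapper>
</data>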

Another solution, less impactful with regard to backward compatibility, is to define a lib -> jars symlink in assembly/pom.xml.
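
A minimal sketch of such a symlink entry, assuming the jdeb plugin version in use supports the 'link' data type:

<data>
  <!-- assumption: requires a jdeb version that supports 'link' data entries -->
  <type>link</type>
  <symlink>true</symlink>
  <linkName>${deb.install.path}/lib</linkName>
  <linkTarget>${deb.install.path}/jars</linkTarget>
</data>

This would keep the existing 'jars' layout intact while satisfying compute-classpath.sh's expectation of a lib directory.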



