Posted to issues@spark.apache.org by "Shay Seng (JIRA)" <ji...@apache.org> on 2014/10/31 03:27:33 UTC

[jira] [Created] (SPARK-4161) Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf

Shay Seng created SPARK-4161:
--------------------------------

             Summary: Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf
                 Key: SPARK-4161
                 URL: https://issues.apache.org/jira/browse/SPARK-4161
             Project: Spark
          Issue Type: Bug
          Components: Spark Shell
    Affects Versions: 1.1.0
         Environment: Mac, Ubuntu
            Reporter: Shay Seng


(1) I want to launch spark-shell with jars that are required only by the driver (i.e., not shipped to the slaves).
 
(2) I added "spark.driver.extraClassPath  /mypath/to.jar" to my spark-defaults.conf
and launched spark-shell with: ./spark-shell

Here I see on the WebUI that spark.driver.extraClassPath has been set, but I am NOT able to access any methods in the jar.
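
Roughly, the kind of access that fails in this case (the class below is just a placeholder for whatever is packaged in /mypath/to.jar):

    scala> import com.example.util.Helpers
    // placeholder class from /mypath/to.jar; the import cannot be resolved here,
    // even though the WebUI shows spark.driver.extraClassPath pointing at the jar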

(3) I removed "spark.driver.extraClassPath" from my spark-defaults.conf
and launched spark-shell with: ./spark-shell --driver-class-path /mypath/to.jar

Again the WebUI shows that spark.driver.extraClassPath has been set,
but this time I am able to access the methods in the jar.
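
The check I use to compare the two launches, from inside the shell (only standard Spark/Java calls; "to.jar" is the placeholder jar name from above):

    scala> sc.getConf.get("spark.driver.extraClassPath")
    // the conf value behind what the WebUI shows; it is set in both launches
    scala> val cp = System.getProperty("java.class.path")
    scala> cp.split(java.io.File.pathSeparator).filter(_.contains("to.jar")).foreach(println)
    // shows whether the jar actually made it onto the driver JVM's classpath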


It looks like when the driver class path is loaded from spark-defaults.conf, it is not correctly appended to the REPL's classpath.
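
For reference, my understanding of why the two classpaths can diverge, sketched with the plain Scala interpreter API (this is not Spark's actual code, just an illustration): the REPL compiles each input line against its own Settings, so a jar that only reaches the launching JVM is not automatically visible to code typed into the shell; something has to append it to the interpreter settings explicitly.

    import scala.tools.nsc.Settings
    import scala.tools.nsc.interpreter.IMain

    val settings = new Settings
    settings.usejavacp.value = true       // start the interpreter from the JVM classpath
    // This is the kind of step that appears to be skipped when the value comes from
    // spark-defaults.conf rather than --driver-class-path:
    settings.classpath.value += java.io.File.pathSeparator + "/mypath/to.jar"
    val interp = new IMain(settings)      // code compiled by this interpreter can now see the jar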



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
