Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2014/11/01 18:34:33 UTC
[jira] [Commented] (SPARK-4161) Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf
[ https://issues.apache.org/jira/browse/SPARK-4161?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14193289#comment-14193289 ]
Apache Spark commented on SPARK-4161:
-------------------------------------
User 'witgo' has created a pull request for this issue:
https://github.com/apache/spark/pull/3050
> Spark shell class path is not correctly set if "spark.driver.extraClassPath" is set in defaults.conf
> ----------------------------------------------------------------------------------------------------
>
> Key: SPARK-4161
> URL: https://issues.apache.org/jira/browse/SPARK-4161
> Project: Spark
> Issue Type: Bug
> Components: Spark Shell
> Affects Versions: 1.1.0
> Environment: Mac, Ubuntu
> Reporter: Shay Seng
> Assignee: Guoqiang Li
>
> (1) I want to launch spark-shell with jars that are required only by the driver (i.e. not shipped to slaves)
>
> (2) I added "spark.driver.extraClassPath /mypath/to.jar" to my spark-defaults.conf
> I launched spark-shell with: ./spark-shell
> Here I see on the WebUI that spark.driver.extraClassPath has been set, but I am NOT able to access any methods in the jar.
> (3) I removed "spark.driver.extraClassPath" from my spark-defaults.conf
> I launched spark-shell with ./spark-shell --driver-class-path /mypath/to.jar
> Again I see on the WebUI that spark.driver.extraClassPath has been set.
> But this time I am able to access the methods in the jar.
> Looks like when the driver class path is loaded from spark-defaults.conf, the REPL's classpath is not correctly appended.
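For reference, a minimal sketch of the two setups being compared, using the reporter's placeholder jar path /mypath/to.jar:

    # Case (2): driver classpath set in conf/spark-defaults.conf
    #   spark.driver.extraClassPath /mypath/to.jar
    # then launch with no extra flags; the jar's classes are NOT visible in the REPL
    ./spark-shell

    # Case (3): driver classpath passed on the command line; the jar's classes ARE visible
    ./spark-shell --driver-class-path /mypath/to.jar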
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org