Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/08/04 03:00:25 UTC

[jira] [Commented] (SPARK-16887) Add SPARK_DIST_CLASSPATH to LAUNCH_CLASSPATH

    [ https://issues.apache.org/jira/browse/SPARK-16887?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15407053#comment-15407053 ] 

Apache Spark commented on SPARK-16887:
--------------------------------------

User 'yhuai' has created a pull request for this issue:
https://github.com/apache/spark/pull/14492

> Add SPARK_DIST_CLASSPATH to LAUNCH_CLASSPATH
> --------------------------------------------
>
>                 Key: SPARK-16887
>                 URL: https://issues.apache.org/jira/browse/SPARK-16887
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>            Reporter: Yin Huai
>            Assignee: Yin Huai
>
> When deploying Spark, it can be convenient to put all the jars we want on Spark's classpath (Spark jars, Hadoop jars, and other libraries' jars) in a single directory, which may not be Spark's assembly directory. So, I am proposing to also add SPARK_DIST_CLASSPATH to LAUNCH_CLASSPATH.
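As an illustration of the proposed behavior (a minimal sketch only, with example paths; the variable-building logic here is assumed and is not the actual patch from the linked pull request), the launcher classpath in `bin/spark-class` would pick up SPARK_DIST_CLASSPATH roughly like this:

```shell
#!/bin/sh
# Sketch: build the launcher classpath, then append SPARK_DIST_CLASSPATH.
# All paths below are example values for illustration.

SPARK_HOME="/opt/spark"                                    # example install location
SPARK_DIST_CLASSPATH="/opt/hadoop/share/hadoop/common/*"   # example; normally set by the deployer

# Base launcher classpath: Spark's own jars directory.
LAUNCH_CLASSPATH="${SPARK_HOME}/jars/*"

# Append distribution-provided jars (e.g. Hadoop jars kept in a
# separate directory) so the launcher itself can see them too.
if [ -n "$SPARK_DIST_CLASSPATH" ]; then
  LAUNCH_CLASSPATH="${LAUNCH_CLASSPATH}:${SPARK_DIST_CLASSPATH}"
fi
```

With the example values above, the launcher would be started with `/opt/spark/jars/*:/opt/hadoop/share/hadoop/common/*` on its classpath instead of the Spark jars directory alone.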



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org