Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2016/08/04 02:53:20 UTC
[jira] [Created] (SPARK-16887) Add SPARK_DIST_CLASSPATH to LAUNCH_CLASSPATH
Yin Huai created SPARK-16887:
--------------------------------
Summary: Add SPARK_DIST_CLASSPATH to LAUNCH_CLASSPATH
Key: SPARK-16887
URL: https://issues.apache.org/jira/browse/SPARK-16887
Project: Spark
Issue Type: Bug
Components: Spark Submit
Reporter: Yin Huai
To deploy Spark, it can be pretty convenient to put all the jars we want on Spark's classpath (Spark jars, Hadoop jars, and other libraries' jars) in a single directory, which may not be Spark's assembly dir. So, I am proposing to also add SPARK_DIST_CLASSPATH to the LAUNCH_CLASSPATH.
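A minimal sketch of what the proposed change could look like in the launcher script. The directory paths and the exact variable wiring are assumptions for illustration; the actual `bin/spark-class` logic may differ:

```shell
#!/usr/bin/env bash
# Sketch: build LAUNCH_CLASSPATH as the launcher might, then append
# SPARK_DIST_CLASSPATH when the user has set it.

SPARK_JARS_DIR="/opt/spark/jars"            # hypothetical Spark install location
LAUNCH_CLASSPATH="$SPARK_JARS_DIR/*"

# Hypothetical shared dir holding Hadoop jars and other libs' jars
SPARK_DIST_CLASSPATH="/opt/shared-libs/*"

# The proposed addition: honor SPARK_DIST_CLASSPATH in the launch classpath too
if [ -n "$SPARK_DIST_CLASSPATH" ]; then
  LAUNCH_CLASSPATH="$LAUNCH_CLASSPATH:$SPARK_DIST_CLASSPATH"
fi

echo "$LAUNCH_CLASSPATH"
```

With the assumed paths above, the launcher JVM would then be started with both directories on its classpath instead of only the assembly/jars dir.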