Posted to issues@spark.apache.org by "Matei Zaharia (JIRA)" <ji...@apache.org> on 2014/08/02 04:03:38 UTC
[jira] [Updated] (SPARK-2116) Load spark-defaults.conf from directory specified by SPARK_CONF_DIR
[ https://issues.apache.org/jira/browse/SPARK-2116?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Matei Zaharia updated SPARK-2116:
---------------------------------
Assignee: Albert Chu
> Load spark-defaults.conf from directory specified by SPARK_CONF_DIR
> -------------------------------------------------------------------
>
> Key: SPARK-2116
> URL: https://issues.apache.org/jira/browse/SPARK-2116
> Project: Spark
> Issue Type: Improvement
> Components: Deploy
> Affects Versions: 1.0.0
> Reporter: Albert Chu
> Assignee: Albert Chu
> Priority: Minor
> Fix For: 1.1.0
>
> Attachments: SPARK-2116.patch
>
>
> Presently, spark-defaults.conf is loaded from SPARK_HOME/conf/spark-defaults.conf. As far as I can tell, the only way to specify an alternate file is to pass one on the command line to spark-submit.
> It would be convenient to have an environment variable that points at an alternate location for spark-defaults.conf. A SPARK_CONF_DIR variable would fit well, similar to HADOOP_CONF_DIR in Hadoop.
> Patch will be attached, github pull request will also be sent.
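
The resolution order the ticket proposes can be sketched in shell. This is an illustrative mock of the lookup logic, not code from the attached patch; the directory path and config keys below are made up for the example.

```shell
# Hypothetical config dir standing in for a user's SPARK_CONF_DIR.
SPARK_CONF_DIR="/tmp/my-spark-conf"
mkdir -p "$SPARK_CONF_DIR"
cat > "$SPARK_CONF_DIR/spark-defaults.conf" <<'EOF'
spark.master            local[2]
spark.executor.memory   1g
EOF

# Mimic the proposed resolution: prefer SPARK_CONF_DIR when it holds a
# spark-defaults.conf, otherwise fall back to SPARK_HOME/conf as today.
if [ -n "$SPARK_CONF_DIR" ] && [ -f "$SPARK_CONF_DIR/spark-defaults.conf" ]; then
  DEFAULTS="$SPARK_CONF_DIR/spark-defaults.conf"
else
  DEFAULTS="$SPARK_HOME/conf/spark-defaults.conf"
fi
echo "Would load: $DEFAULTS"
```

With the variable set, the fallback branch is never taken, mirroring how HADOOP_CONF_DIR overrides Hadoop's default config location.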
--
This message was sent by Atlassian JIRA
(v6.2#6252)