Posted to issues@spark.apache.org by "Terry Moschou (JIRA)" <ji...@apache.org> on 2016/06/03 14:27:59 UTC

[jira] [Comment Edited] (SPARK-15747) Support SPARK_CONF_DIR/spark-defaults.d/*.conf drop-in style config files

    [ https://issues.apache.org/jira/browse/SPARK-15747?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15314181#comment-15314181 ] 

Terry Moschou edited comment on SPARK-15747 at 6/3/16 2:27 PM:
---------------------------------------------------------------

Sorry, when I said "source" multiple {{spark-defaults.d/*.conf}} property files, I meant loading them programmatically, not shell-script sourcing.

We use Ansible to install and configure Spark and to maintain our cluster, applications, etc. It would be handy to logically separate the installation from the configuration. For instance, our Ansible plays that deploy Spark history servers, which build on the plays that install Spark, could simply drop in a 10-history-server.conf file with {{spark.history.*}} properties.

There are obviously ways around this, e.g. have Ansible mutate spark-defaults.conf, or use a drop-in {{spark-env.d/*.sh}} shell script (sourced by spark-env.sh) to build SPARK_HISTORY_OPTS.
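A minimal sketch of that spark-env.d workaround (hypothetical layout: spark-env.d is not a stock Spark directory, and the helper name is made up):

```shell
# Hypothetical drop-in loader, appended to $SPARK_CONF_DIR/spark-env.sh.
# Sources every spark-env.d/*.sh in lexicographic (glob) order, so a
# drop-in like 10-history-server.sh can extend SPARK_HISTORY_OPTS.
load_env_dropins() {
  for f in "${SPARK_CONF_DIR:-/etc/spark/conf}"/spark-env.d/*.sh; do
    if [ -r "$f" ]; then
      . "$f"
    fi
  done
}

# Example drop-in, spark-env.d/10-history-server.sh:
#   SPARK_HISTORY_OPTS="$SPARK_HISTORY_OPTS -Dspark.history.ui.port=18081"
```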

But there are certainly many other use cases I can think of where this feature would be useful.


was (Author: tmoschou):
Sorry, when I said "source" multiple {{spark-defaults.d/*.conf}} property files, I meant loading them programmatically, not shell-script sourcing.

We use Ansible to install and configure Spark and to maintain our cluster, applications, etc. It would be handy to logically separate the installation from the configuration. For instance, our Ansible plays that deploy Spark history servers, which build on the plays that install Spark, could simply drop in a 10-history-server.conf file with {{spark.history.*}} properties.

I understand there are a few different ways around this, e.g. have Ansible mutate spark-defaults.conf, or use a drop-in {{spark-env.d/*.sh}} shell script (sourced by spark-env.sh) to build SPARK_HISTORY_OPTS.

But there are certainly many other use cases I can think of where this feature would be useful.

> Support SPARK_CONF_DIR/spark-defaults.d/*.conf drop-in style config files
> -------------------------------------------------------------------------
>
>                 Key: SPARK-15747
>                 URL: https://issues.apache.org/jira/browse/SPARK-15747
>             Project: Spark
>          Issue Type: New Feature
>            Reporter: Terry Moschou
>
> Feature request to automatically load all files matching {{SPARK_CONF_DIR/spark-defaults.d/*.conf}} alongside spark-defaults.conf, to make Spark defaults configuration easier to maintain and deploy.
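The requested merge semantics could be sketched as follows (a hypothetical helper, not Spark's actual loader: drop-ins are read after spark-defaults.conf, in glob order, and the last occurrence of a property key wins):

```shell
# Hypothetical sketch of the requested behavior: concatenate
# spark-defaults.conf with every spark-defaults.d/*.conf (glob order is
# lexicographic, so 10-*.conf precedes 20-*.conf) and keep only the last
# occurrence of each property key.
merge_spark_defaults() {
  conf_dir="${SPARK_CONF_DIR:-/etc/spark/conf}"
  for f in "$conf_dir/spark-defaults.conf" "$conf_dir"/spark-defaults.d/*.conf; do
    if [ -r "$f" ]; then
      cat "$f"
    fi
  done | awk '!/^[[:space:]]*(#|$)/ { kv[$1] = $0 }
              END { for (k in kv) print kv[k] }'
}
```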



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org