Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2014/06/10 22:35:02 UTC

[jira] [Updated] (SPARK-2098) All Spark processes should support spark-defaults.conf, config file

     [ https://issues.apache.org/jira/browse/SPARK-2098?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin updated SPARK-2098:
----------------------------------

    Issue Type: Improvement  (was: Bug)

> All Spark processes should support spark-defaults.conf, config file
> -------------------------------------------------------------------
>
>                 Key: SPARK-2098
>                 URL: https://issues.apache.org/jira/browse/SPARK-2098
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Marcelo Vanzin
>
> SparkSubmit supports reading SparkConf settings from a config file. This is handy because an administrator can set up a site-wide configuration file, while power users can supply their own when needed, or fall back to JVM system properties or other means of overriding configs.
> It would be nice if all Spark processes (e.g. master / worker / history server) supported something similar. For daemon processes this is particularly interesting because it decouples starting the daemon (e.g. via an /etc/init.d script packaged by a distribution) from configuring it. Right now the only way to change those daemons' configuration is to set environment variables, which does not fit that scenario well.
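
As an illustration of what the description asks for, the sketch below shows one way a daemon could pick up spark-defaults.conf on startup. It is only a sketch under stated assumptions: DaemonDefaults, loadDefaults, and the path argument are hypothetical names for illustration, not Spark's actual implementation.

    import java.io.{File, FileInputStream, InputStreamReader}
    import java.nio.charset.StandardCharsets
    import java.util.Properties

    import scala.collection.JavaConverters._

    import org.apache.spark.SparkConf

    // Hypothetical helper (not part of Spark) showing one way a daemon
    // such as the master, worker, or history server could honor
    // spark-defaults.conf at startup.
    object DaemonDefaults {

      // Load "spark.*" entries from a properties file into the given SparkConf.
      // setIfMissing keeps anything already set (e.g. via -Dspark.foo=bar)
      // intact, so explicit overrides still win over file defaults.
      def loadDefaults(conf: SparkConf, path: String): SparkConf = {
        val file = new File(path)
        if (file.isFile) {
          val props = new Properties()
          val reader = new InputStreamReader(
            new FileInputStream(file), StandardCharsets.UTF_8)
          try props.load(reader) finally reader.close()
          props.stringPropertyNames().asScala
            .filter(_.startsWith("spark."))
            .foreach(k => conf.setIfMissing(k, props.getProperty(k).trim))
        }
        conf
      }
    }

A daemon's main() could then call loadDefaults on its SparkConf before starting any services, which would let an init script launch the process unchanged while the configuration lives entirely in the conf file.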



--
This message was sent by Atlassian JIRA
(v6.2#6252)