Posted to issues@spark.apache.org by "Saisai Shao (JIRA)" <ji...@apache.org> on 2016/07/13 07:25:20 UTC

[jira] [Created] (SPARK-16521) Add support of parameterized configuration for SparkConf

Saisai Shao created SPARK-16521:
-----------------------------------

             Summary: Add support of parameterized configuration for SparkConf
                 Key: SPARK-16521
                 URL: https://issues.apache.org/jira/browse/SPARK-16521
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
            Reporter: Saisai Shao


Currently SparkConf is a key/value pair mechanism in which every value is a literal string that cannot be changed. In most use cases this plain key/value system is expressive enough, but in some cases it would be more convenient for a value to contain parameterized variables that are substituted from other configurations.

One case is {{spark.sql.warehouse.dir}}, whose default value is "file:${system:user.dir}/spark-warehouse", where {{user.dir}} is replaced with the corresponding system property at runtime.
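
Below is a minimal sketch, for illustration only and not the actual SparkConf code, of how a {{${system:...}}} reference could be expanded from JVM system properties; the object and method names here are hypothetical:

{code}
// Illustration only: expanding "${system:...}" references from JVM system
// properties. This is not the actual SparkConf implementation.
object SystemVarSubstitution {
  private val SystemRef = """\$\{system:([^}]+)\}""".r

  def substitute(value: String): String =
    SystemRef.replaceAllIn(value, m =>
      // If the system property is not set, keep the raw reference unchanged
      java.util.regex.Matcher.quoteReplacement(
        sys.props.getOrElse(m.group(1), m.matched)))

  def main(args: Array[String]): Unit = {
    // Prints e.g. "file:/home/alice/spark-warehouse" when user.dir is /home/alice
    println(substitute("file:${system:user.dir}/spark-warehouse"))
  }
}
{code}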

Likewise, configurations such as:

{code}
spark.dynamicAllocation.minExecutors 1
spark.dynamicAllocation.initialExecutors 1
{code}

can also be configured as:

{code}
spark.dynamicAllocation.minExecutors 1
spark.dynamicAllocation.initialExecutors ${spark.dynamicAllocation.minExecutors}
{code}
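
For illustration, here is a minimal sketch of how a {{${other.key}}} reference could be resolved against the same key/value map, assuming there are no circular references; the names below are hypothetical and not the proposed API:

{code}
// Illustration only: resolving "${other.key}" references against the same
// key/value map. Assumes no cycles. Not the actual SparkConf implementation.
object ConfVarSubstitution {
  private val ConfRef = """\$\{([^}]+)\}""".r

  // Recursively resolves "${key}" references; unknown keys are left as-is
  def resolve(conf: Map[String, String], value: String): String =
    ConfRef.replaceAllIn(value, m =>
      java.util.regex.Matcher.quoteReplacement(
        conf.get(m.group(1)).map(v => resolve(conf, v)).getOrElse(m.matched)))

  def main(args: Array[String]): Unit = {
    val conf = Map(
      "spark.dynamicAllocation.minExecutors" -> "1",
      "spark.dynamicAllocation.initialExecutors" ->
        "${spark.dynamicAllocation.minExecutors}")
    // Prints "1"
    println(resolve(conf, conf("spark.dynamicAllocation.initialExecutors")))
  }
}
{code}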

So I propose adding parameterized configuration support to SparkConf. This does not change the original semantics of SparkConf; it just adds one more way to express a configuration.

This feature is quite useful in our environment: we have some configurations that are version dependent, and it is error prone and tedious to change them whenever the environment changes.
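
As a hypothetical example (the key {{spark.app.version}} below is made up for illustration), a version-dependent value could be written once and referenced elsewhere:

{code}
spark.app.version               2.0.0
spark.yarn.archive              hdfs:///deploy/spark-${spark.app.version}/jars.zip
spark.executor.extraClassPath   /opt/spark-${spark.app.version}/extra/*
{code}

With this, upgrading Spark would only require editing the single version entry instead of every path that embeds the version.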

Please suggest and comment, thanks a lot.




