Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2019/02/20 14:50:00 UTC

[jira] [Commented] (SPARK-22709) move config related infrastructure from Spark Core to a new module

    [ https://issues.apache.org/jira/browse/SPARK-22709?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16773072#comment-16773072 ] 

Sean Owen commented on SPARK-22709:
-----------------------------------

Saw this late, but I'm not sure about it. It would mean a new module that everything depends on, holding configs for everything. I think that's not necessarily a win over, say, having a {{config/package.scala}} or equivalent in each individual module. The existing dependency graph ought to let dependent modules, where needed, see these module-specific configs.
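For context, the "config infrastructure" under discussion is Spark's typed config-entry pattern: each setting is declared once, in one place, as a key with a type and a default. A minimal sketch of that pattern, in Java since the network module mentioned in the issue is Java-based (the class names and config keys below are hypothetical illustrations, not actual Spark APIs):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of a typed config entry: a key, a default, and a parser,
// looked up against a raw string-to-string settings map.
final class ConfigEntry<T> {
    final String key;
    final T defaultValue;
    final Function<String, T> parser;

    ConfigEntry(String key, T defaultValue, Function<String, T> parser) {
        this.key = key;
        this.defaultValue = defaultValue;
        this.parser = parser;
    }

    // Return the parsed value if the key is set, otherwise the default.
    T readFrom(Map<String, String> settings) {
        String raw = settings.get(key);
        return raw == null ? defaultValue : parser.apply(raw);
    }
}

public class ConfigSketch {
    // Entries centralized in one place per module, in the spirit of a
    // per-module config/package.scala. Keys here are made up for illustration.
    static final ConfigEntry<Integer> IO_THREADS =
        new ConfigEntry<>("spark.network.io.threads", 8, Integer::parseInt);
    static final ConfigEntry<Boolean> SASL_ENABLED =
        new ConfigEntry<>("spark.network.sasl.enabled", false, Boolean::parseBoolean);

    public static void main(String[] args) {
        Map<String, String> settings = new HashMap<>();
        settings.put("spark.network.io.threads", "16");

        System.out.println(IO_THREADS.readFrom(settings));   // parsed from the map: 16
        System.out.println(SASL_ENABLED.readFrom(settings)); // falls back to default: false
    }
}
```

Whether these declarations live in one shared spark-configs module or in a per-module package is exactly the trade-off being debated: the shared module centralizes the mechanism and every entry, while per-module packages centralize only the mechanism and keep each module's entries with its code.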

> move config related infrastructure from Spark Core to a new module
> ------------------------------------------------------------------
>
>                 Key: SPARK-22709
>                 URL: https://issues.apache.org/jira/browse/SPARK-22709
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Wenchen Fan
>            Priority: Major
>
> Nowadays the config-related infrastructure lives in the Spark Core module, and we use it to centralize Spark configs, e.g. `org.apache.spark.internal.config`, SQLConf, etc.
> However, modules that don't depend on Core, like the network module, don't have this infrastructure, so their configs are scattered across the code base.
> We should move the config infrastructure to a new module, spark-configs, so that all other modules can use it to centralize their configs.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org