Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/03/26 10:08:42 UTC

[jira] [Assigned] (SPARK-20100) Consolidate SessionState construction

     [ https://issues.apache.org/jira/browse/SPARK-20100?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-20100:
------------------------------------

    Assignee: Apache Spark  (was: Herman van Hovell)

> Consolidate SessionState construction
> -------------------------------------
>
>                 Key: SPARK-20100
>                 URL: https://issues.apache.org/jira/browse/SPARK-20100
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Herman van Hovell
>            Assignee: Apache Spark
>
> The current SessionState initialization path is quite complex. Part of the creation is done in the SessionState companion objects, part is done inside the SessionState class itself, and part is done by passing functions.
> The proposal is to consolidate SessionState initialization into a builder class. SessionState will no longer do any initialization and instead becomes a placeholder for the various Spark SQL internals. The advantages of this approach are the following:
> - SessionState initialization is less dispersed. The builder should be a one-stop shop.
> - This gives us a starting point for removing HiveSessionState. Removing the Hive session state would also require us to move resource loading into a separate class, and to (re)move the metadata Hive.
> - It is easier to customize the Spark session: you just need to create a custom version of the builder. I will add hooks to make this easier. Opening up these APIs will happen at a later point.
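The builder approach described above can be sketched as follows. This is an illustration only, not Spark's actual API: the names (SessionState, SessionStateBuilder, SqlParser, Analyzer) loosely mirror the real internals, and the factory-method customization hook is an assumption about how the proposed builder might be overridden.

```scala
// Stand-ins for Spark SQL internals that the session state holds.
class SqlParser
class Analyzer

// The holder: no initialization logic, just references to the internals.
class SessionState(val parser: SqlParser, val analyzer: Analyzer)

// The builder: a one-stop shop for construction. Subclasses override the
// protected factory methods to customize the session (e.g. Hive support).
class SessionStateBuilder {
  protected def parser: SqlParser = new SqlParser
  protected def analyzer: Analyzer = new Analyzer
  def build(): SessionState = new SessionState(parser, analyzer)
}

// Customizing a session only requires overriding the relevant pieces.
class CustomParser extends SqlParser
class CustomSessionStateBuilder extends SessionStateBuilder {
  override protected def parser: SqlParser = new CustomParser
}
```

With this shape, all construction happens in one place (build()), and a custom session is created by swapping in a different builder rather than by touching SessionState itself.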



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org