Posted to issues@spark.apache.org by "Yin Huai (JIRA)" <ji...@apache.org> on 2016/06/01 07:04:13 UTC

[jira] [Commented] (SPARK-15691) Refactor and improve Hive support

    [ https://issues.apache.org/jira/browse/SPARK-15691?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15309438#comment-15309438 ] 

Yin Huai commented on SPARK-15691:
----------------------------------

I'd add removing HiveMetastoreCatalog as part of the work that moves Hive-specific catalog logic into HiveExternalCatalog.

> Refactor and improve Hive support
> ---------------------------------
>
>                 Key: SPARK-15691
>                 URL: https://issues.apache.org/jira/browse/SPARK-15691
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>            Reporter: Reynold Xin
>
> Hive support is important to Spark SQL, as many Spark users use it to read from Hive. The current architecture is very difficult to maintain, and this ticket tracks progress towards getting us to a sane state.
> A number of things we want to accomplish are:
> - Remove HiveSessionCatalog. All Hive-related logic should go into HiveExternalCatalog. This would require moving caching either into HiveExternalCatalog or into SessionCatalog.
> - Move the Hive-specific catalog logic (e.g. using table properties to store data source options) into HiveExternalCatalog.
> - Remove Hive's specific ScriptTransform implementation and generalize it so we can move it into sql/core.
> - Implement HiveTableScan (and write path) as a data source, so we don't need a special planner rule for HiveTableScan.
> - Remove HiveSharedState and HiveSessionState.
> One thing that is still unclear to me is how to handle Hive UDF support. We might still need a special planner rule there.
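One way to read the second bullet above (a hypothetical sketch, not Spark's actual implementation): data source options can be round-tripped through the metastore's string-keyed table properties by namespacing them under a reserved prefix. The class name, method names, and prefix below are made up for illustration.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: store data source options inside Hive table
// properties under a reserved key prefix, and recover them later.
class DataSourceOptionCodec {
    static final String PREFIX = "spark.sql.sources.option.";

    // Turn user-supplied options into prefixed table properties.
    static Map<String, String> encode(Map<String, String> options) {
        Map<String, String> props = new HashMap<>();
        options.forEach((k, v) -> props.put(PREFIX + k, v));
        return props;
    }

    // Recover the original options from the stored table properties,
    // ignoring any property keys outside the reserved prefix.
    static Map<String, String> decode(Map<String, String> properties) {
        Map<String, String> options = new HashMap<>();
        properties.forEach((k, v) -> {
            if (k.startsWith(PREFIX)) {
                options.put(k.substring(PREFIX.length()), v);
            }
        });
        return options;
    }
}
```

Because the metastore only persists flat string maps, a prefix like this lets catalog code tell data source options apart from ordinary Hive table properties without a separate storage mechanism.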



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org