Posted to issues@spark.apache.org by "Arun Ravi M V (Jira)" <ji...@apache.org> on 2019/11/28 04:01:00 UTC

[jira] [Updated] (SPARK-30022) Supporting Parsing of Simple Hive Virtual View created from Presto

     [ https://issues.apache.org/jira/browse/SPARK-30022?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Arun Ravi M V updated SPARK-30022:
----------------------------------
    Summary: Supporting Parsing of Simple Hive Virtual View created from Presto  (was: Supporting reading of Hive Virtual View created from Presto)

> Supporting Parsing of Simple Hive Virtual View created from Presto
> ------------------------------------------------------------------
>
>                 Key: SPARK-30022
>                 URL: https://issues.apache.org/jira/browse/SPARK-30022
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.4
>            Reporter: Arun Ravi M V
>            Priority: Major
>
> In Apache Spark, views are resolved using the VIEW_EXPANDED_TEXT column from the Hive metastore; CatalogTable.viewText is set in the HiveClientImpl class. I would like to propose a modification that allows extracting the Presto view definition from the Hive metastore and adding a dedicated Presto SQL parser. Spark already supports plugging new parsers in on top of the existing one, so I am wondering whether HiveClientImpl could be made pluggable as well, letting users override the default implementation to support Presto views. Presto views are stored in the Hive metastore's TBLS table, in the VIEW_ORIGINAL_TEXT column, in an encoded format.
> The assumption is that the user will provide Spark-compliant UDFs if any were used in the view definition.
>  
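For context on the "encoded format" mentioned above: Presto writes a view's definition into the metastore's view-original-text field as a marker comment wrapping base64-encoded JSON. A minimal sketch of detecting and decoding that wrapper is below; the class and method names are illustrative only and not part of any Spark or Presto API.

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class PrestoViewDecoder {
    // Presto stores view definitions as "/* Presto View: <base64 JSON> */"
    // in the Hive metastore's VIEW_ORIGINAL_TEXT column.
    private static final String PREFIX = "/* Presto View: ";
    private static final String SUFFIX = " */";

    /**
     * Returns the decoded view-definition JSON, or null when the text
     * does not carry the Presto view marker (e.g. a plain Hive view).
     */
    public static String decode(String viewOriginalText) {
        String trimmed = viewOriginalText.trim();
        if (trimmed.startsWith(PREFIX) && trimmed.endsWith(SUFFIX)) {
            String encoded =
                trimmed.substring(PREFIX.length(), trimmed.length() - SUFFIX.length());
            byte[] decoded = Base64.getDecoder().decode(encoded);
            return new String(decoded, StandardCharsets.UTF_8);
        }
        return null;
    }
}
```

The decoded JSON contains the original SQL text of the view, which is what a pluggable Presto-aware parser would then need to translate into a Spark logical plan.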



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org