Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:14:07 UTC

[jira] [Resolved] (SPARK-19667) Create table with HiveEnabled in default database use warehouse path instead of the location of default database

     [ https://issues.apache.org/jira/browse/SPARK-19667?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-19667.
----------------------------------
    Resolution: Incomplete

> Create table with HiveEnabled in default database use warehouse path instead of the location of default database
> ----------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-19667
>                 URL: https://issues.apache.org/jira/browse/SPARK-19667
>             Project: Spark
>          Issue Type: New Feature
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Song Jun
>            Priority: Major
>              Labels: bulk-closed
>
> Currently, when we create a managed table with Hive enabled in the default database, Spark uses the location of the default database as the table's location. This is fine with a non-shared metastore.
> But if we share a metastore between different clusters, for example:
> 1) There is a Hive metastore in Cluster-A, and the metastore uses a remote MySQL instance as its database. The default database is created in that metastore, so its location is a path in Cluster-A.
> 2) We then set up another Cluster-B, which uses the same remote MySQL instance for its metastore. The default database configuration in Cluster-B is read from MySQL, so its location is still the path in Cluster-A.
> 3) When we create a table in the default database from Cluster-B, it throws an exception: UnknownHost Cluster-A.
> In Hive 2.0.0, creating a table in the default database is allowed even when the metastore is shared between clusters; this is not permitted for other databases, only the default one.
> As Spark users, we want the same behavior as Hive, so that we can create tables in the default database.
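
Below is a minimal sketch of the reported scenario, using Spark's Scala API. The cluster names, the HDFS path, and the table name are illustrative assumptions, not taken from the report; the point is that on Cluster-B the CREATE TABLE resolves the table path against the default database's stored location, which still points at Cluster-A's namenode.

    import org.apache.spark.sql.SparkSession

    // Running on Cluster-B, configured against the same remote, MySQL-backed
    // Hive metastore that Cluster-A originally populated.
    val spark = SparkSession.builder()
      .appName("default-db-location-demo")
      .enableHiveSupport() // talk to the shared Hive metastore
      .getOrCreate()

    // The default database's location was recorded when Cluster-A created it,
    // e.g. hdfs://cluster-a/user/hive/warehouse (hypothetical path).
    spark.sql("DESCRIBE DATABASE default").show(truncate = false)

    // A managed table's path is derived from that stored location, so from
    // Cluster-B this can fail with java.net.UnknownHostException: cluster-a.
    spark.sql("CREATE TABLE default.demo_table (i INT)")

The requested behavior is for Spark to resolve managed tables in the default database against the current cluster's warehouse path (spark.sql.warehouse.dir) rather than the location stored in the metastore, matching what Hive 2.0.0 does for the default database.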



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org