Posted to issues@spark.apache.org by "Xiaochen Ouyang (JIRA)" <ji...@apache.org> on 2017/01/12 00:52:16 UTC
[jira] [Commented] (SPARK-19115) SparkSQL unsupports the command "create external table if not exist new_tbl like old_tbl"
[ https://issues.apache.org/jira/browse/SPARK-19115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15819716#comment-15819716 ]
Xiaochen Ouyang commented on SPARK-19115:
-----------------------------------------
May I ask whether a later version of Spark supports the following command: create external table if not exists gen_tbl like src_tbl location '/warehouse/data/gen_tbl'?
Do you have a plan to support this command in the future?
> SparkSQL unsupports the command " create external table if not exist new_tbl like old_tbl"
> -------------------------------------------------------------------------------------------
>
> Key: SPARK-19115
> URL: https://issues.apache.org/jira/browse/SPARK-19115
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.1
> Environment: spark2.0.1 hive1.2.1
> Reporter: Xiaochen Ouyang
>
> Spark 2.0.1 does not support the command "create external table if not exist new_tbl like old_tbl".
> We tried to modify the SqlBase.g4 file, changing
> " | CREATE TABLE (IF NOT EXISTS)? target=tableIdentifier
> LIKE source=tableIdentifier #createTableLike"
> to
> " | CREATE EXTERNAL? TABLE (IF NOT EXISTS)? target=tableIdentifier
> LIKE source=tableIdentifier #createTableLike"
> We then recompiled Spark and replaced the jar "spark-catalyst-2.0.1.jar". After that, the command "create external table if not exist new_tbl like old_tbl" ran successfully; unfortunately, the generated table's type in the metastore database is MANAGED_TABLE rather than EXTERNAL_TABLE.
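The behavior reported above can be illustrated with a minimal, self-contained sketch (plain Python, not Spark's actual code): making the grammar accept the EXTERNAL keyword is not enough, because the component that builds the table metadata must also propagate that flag. If the builder ignores it, every table comes out MANAGED, which matches what was observed in the metastore. The function and field names here are purely illustrative assumptions.

```python
import re

# Hypothetical simplified grammar, mirroring the modified rule:
#   CREATE EXTERNAL? TABLE (IF NOT EXISTS)? target LIKE source
PATTERN = re.compile(
    r"create\s+(external\s+)?table\s+(?:if\s+not\s+exists\s+)?"
    r"(\w+)\s+like\s+(\w+)\s*",
    re.IGNORECASE,
)

def build_buggy(sql):
    """Mirrors the reported behavior: EXTERNAL parses but is ignored."""
    m = PATTERN.fullmatch(sql.strip())
    _external, target, source = m.groups()
    # The matched EXTERNAL token is never consulted, so the table
    # type silently defaults to MANAGED.
    return {"target": target, "source": source, "type": "MANAGED"}

def build_fixed(sql):
    """The flag matched by the grammar must reach the table metadata."""
    m = PATTERN.fullmatch(sql.strip())
    external, target, source = m.groups()
    return {
        "target": target,
        "source": source,
        "type": "EXTERNAL" if external else "MANAGED",
    }

sql = "create external table if not exists new_tbl like old_tbl"
print(build_buggy(sql)["type"])   # MANAGED  (the observed bug)
print(build_fixed(sql)["type"])   # EXTERNAL (the expected metastore type)
```

In real Spark terms this means the grammar change in SqlBase.g4 must be paired with a change in the code that constructs the table descriptor, so the external flag actually reaches the metastore.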
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org