Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/01/18 07:38:27 UTC
[jira] [Commented] (SPARK-19115) SparkSQL does not support the command " create external table if not exist new_tbl like old_tbl location '/warehouse/new_tbl' "
[ https://issues.apache.org/jira/browse/SPARK-19115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15827547#comment-15827547 ]
Xiao Li commented on SPARK-19115:
---------------------------------
Sorry for the late reply. This sounds reasonable. Does Hive support such a query?
> SparkSQL does not support the command " create external table if not exist new_tbl like old_tbl location '/warehouse/new_tbl' "
> --------------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-19115
> URL: https://issues.apache.org/jira/browse/SPARK-19115
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.1
> Environment: spark2.0.1 hive1.2.1
> Reporter: Xiaochen Ouyang
>
> Spark 2.0.1 does not support the command " create external table if not exist new_tbl like old_tbl location '/warehouse/new_tbl' "
> We modified the SqlBase.g4 grammar file, changing
> "| CREATE TABLE (IF NOT EXISTS)? target=tableIdentifier
>      LIKE source=tableIdentifier #createTableLike"
> to
> "| CREATE EXTERNAL? TABLE (IF NOT EXISTS)? target=tableIdentifier
>      LIKE source=tableIdentifier locationSpec? #createTableLike"
> We then modified the method 'visitCreateTableLike' in the Scala file 'SparkSqlParser.scala' and updated the case class CreateTableLikeCommand in 'tables.scala'.
> Finally, we compiled Spark, replaced the jars 'spark-catalyst-2.0.1.jar' and 'spark-sql_2.11-2.0.1.jar', and ran the command " create external table if not exist new_tbl like old_tbl location '/warehouse/new_tbl' " successfully.
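
The Scala-side change the reporter describes might look roughly like this. This is a hypothetical sketch, not the actual patch: the extra `location` field, the EXTERNAL/MANAGED switch, and the catalog calls are assumptions about Spark 2.0.x internals, and the real signatures may differ.

```scala
// Sketch of CreateTableLikeCommand in
// org.apache.spark.sql.execution.command (tables.scala), extended to carry
// the optional LOCATION parsed from the new locationSpec? grammar rule.
case class CreateTableLikeCommand(
    targetTable: TableIdentifier,
    sourceTable: TableIdentifier,
    location: Option[String],   // new: path from the optional LOCATION clause
    ifNotExists: Boolean) extends RunnableCommand {

  override def run(sparkSession: SparkSession): Seq[Row] = {
    val catalog = sparkSession.sessionState.catalog
    val sourceTableDesc = catalog.getTableMetadata(sourceTable)
    // An explicit LOCATION makes the new table EXTERNAL; otherwise it is MANAGED.
    val tableType =
      if (location.isDefined) CatalogTableType.EXTERNAL
      else CatalogTableType.MANAGED
    val newTableDesc = sourceTableDesc.copy(
      identifier = targetTable,
      tableType = tableType,
      storage = sourceTableDesc.storage.copy(locationUri = location))
    catalog.createTable(newTableDesc, ignoreIfExists = ifNotExists)
    Seq.empty[Row]
  }
}
```

visitCreateTableLike in SparkSqlParser.scala would then pass the parsed locationSpec path (if any) into this extra constructor argument.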
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org