Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/02/09 08:38:41 UTC
[jira] [Commented] (SPARK-19115) SparkSQL does not support the command
"create external table if not exists new_tbl like old_tbl location
'/warehouse/new_tbl'"
[ https://issues.apache.org/jira/browse/SPARK-19115?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15859211#comment-15859211 ]
Apache Spark commented on SPARK-19115:
--------------------------------------
User 'ouyangxiaochen' has created a pull request for this issue:
https://github.com/apache/spark/pull/16868
> SparkSQL does not support the command "create external table if not exists new_tbl like old_tbl location '/warehouse/new_tbl'"
> -----------------------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-19115
> URL: https://issues.apache.org/jira/browse/SPARK-19115
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.0.1
> Environment: spark2.0.1 hive1.2.1
> Reporter: Xiaochen Ouyang
> Assignee: Xiao Li
>
> Spark 2.0.1 does not support the command "create external table if not exists new_tbl like old_tbl location '/warehouse/new_tbl'".
> We modified the SqlBase.g4 grammar file, changing
> " | CREATE TABLE (IF NOT EXISTS)? target=tableIdentifier
> LIKE source=tableIdentifier #createTableLike"
> to
> " | CREATE EXTERNAL? TABLE (IF NOT EXISTS)? target=tableIdentifier
> LIKE source=tableIdentifier locationSpec? #createTableLike"
> We then modified the 'visitCreateTableLike' method in 'SparkSqlParser.scala' and updated the 'CreateTableLikeCommand' case class in 'tables.scala'.
> Finally, we recompiled Spark and replaced the following jars: 'spark-catalyst-2.0.1.jar' and 'spark-sql_2.11-2.0.1.jar'. After that, the command "create external table if not exists new_tbl like old_tbl location '/warehouse/new_tbl'" ran successfully.
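The change described above can be sketched in simplified, self-contained Scala. This is not actual Spark internals: the case classes and the builder function below are illustrative stand-ins for Spark's real CreateTableLikeCommand and the parser's visitCreateTableLike, showing only how the new optional LOCATION clause captured by the extended grammar rule would flow into the command.

```scala
// Illustrative stand-in for Spark's TableIdentifier.
case class TableIdentifier(table: String, database: Option[String] = None)

// Simplified stand-in for CreateTableLikeCommand, extended with an
// optional storage location (a LOCATION clause implies an external table).
case class CreateTableLikeCommand(
    targetTable: TableIdentifier,
    sourceTable: TableIdentifier,
    location: Option[String],
    ifNotExists: Boolean)

// Rough shape of the extended visitCreateTableLike: the parser passes
// through the optional locationSpec captured by the new grammar rule.
def buildCreateTableLike(
    target: TableIdentifier,
    source: TableIdentifier,
    locationSpec: Option[String],
    ifNotExists: Boolean): CreateTableLikeCommand =
  CreateTableLikeCommand(target, source, locationSpec, ifNotExists)

// Corresponds to:
//   create external table if not exists new_tbl like old_tbl
//   location '/warehouse/new_tbl'
val cmd = buildCreateTableLike(
  TableIdentifier("new_tbl"),
  TableIdentifier("old_tbl"),
  Some("/warehouse/new_tbl"),
  ifNotExists = true)
```

With the unmodified grammar, the `locationSpec?` part is absent, so there is nowhere for the parser to record the location; the sketch shows the minimal extra field the patched command class needs to carry it.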
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org