Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2019/08/20 06:43:00 UTC
[jira] [Assigned] (SPARK-28662) Create Hive Partitioned Table without specifying data type for partition columns succeeds unexpectedly
[ https://issues.apache.org/jira/browse/SPARK-28662?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan reassigned SPARK-28662:
-----------------------------------
Assignee: Li Hao
> Create Hive Partitioned Table without specifying data type for partition columns succeeds unexpectedly
> ------------------------------------------------------------------------------------------------------------
>
> Key: SPARK-28662
> URL: https://issues.apache.org/jira/browse/SPARK-28662
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Li Hao
> Assignee: Li Hao
> Priority: Minor
> Fix For: 3.0.0
>
>
> *Case :*
> Creating a Hive partitioned table without specifying a data type for a partition column succeeds unexpectedly.
> {code:java}
> // create a Hive table partitioned by b, but the data type of b isn't specified
> CREATE TABLE tbl(a int) PARTITIONED BY (b) STORED AS parquet
> {code}
>
> *Root Cause:*
> In https://issues.apache.org/jira/browse/SPARK-26435 , the PARTITIONED BY clause was extended to support Hive CTAS as follows:
> {code:java}
> // Before
> (PARTITIONED BY '(' partitionColumns=colTypeList ')'
> // After
> (PARTITIONED BY '(' partitionColumns=colTypeList ')' |
> PARTITIONED BY partitionColumnNames=identifierList) |
> {code}
> A CREATE TABLE statement like the one above therefore passes the syntax check and is recognized as (PARTITIONED BY partitionColumnNames=identifierList).
> We should check for this case in visitCreateHiveTable and give an explicit error message to the user.
>
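The proposed check could look roughly like the following sketch (Python for illustration only; the function name, signature, and error message are hypothetical, and Spark's actual check would live in the Scala AstBuilder's visitCreateHiveTable):

```python
# Illustrative sketch only: all names here are hypothetical, not Spark's
# actual AstBuilder code.
def check_partition_columns(partition_cols, is_ctas):
    """partition_cols: list of (name, data_type) pairs; data_type may be None."""
    if is_ctas:
        # The identifier-only PARTITIONED BY form is intended for Hive CTAS,
        # where partition column types are inferred from the query.
        return
    untyped = [name for name, dtype in partition_cols if dtype is None]
    if untyped:
        raise ValueError(
            "Partition column(s) %s must specify a data type; the "
            "identifier-only PARTITIONED BY form is only allowed for CTAS"
            % ", ".join(untyped))
```

With such a check, the statement from the case above, `CREATE TABLE tbl(a int) PARTITIONED BY (b) STORED AS parquet`, would fail fast with an explicit message (here, `check_partition_columns([("b", None)], is_ctas=False)` raises) instead of silently creating the table.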
--
This message was sent by Atlassian Jira
(v8.3.2#803003)