Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2020/08/09 06:17:00 UTC
[jira] [Resolved] (SPARK-32563) spark-sql doesn't support insert into mixed static & dynamic partition
[ https://issues.apache.org/jira/browse/SPARK-32563?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Takeshi Yamamuro resolved SPARK-32563.
--------------------------------------
Resolution: Not A Problem
> spark-sql doesn't support insert into mixed static & dynamic partition
> -----------------------------------------------------------------------
>
> Key: SPARK-32563
> URL: https://issues.apache.org/jira/browse/SPARK-32563
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.2
> Environment: HDP version 2.3.2.3.1.4.0-315
> Reporter: yx91490
> Priority: Major
> Attachments: SPARK-32563.log
>
>
> spark-sql doesn't support inserting into a mixed static & dynamic partition. For example:
> source table:
> {code:sql}
> CREATE TABLE `id_name`(`id` int, `name` string)
> PARTITIONED BY (`dt` string)
> {code}
> dest table:
> {code:sql}
> CREATE TABLE `id_name_dt1_dt2`(`id` int, `name` string)
> PARTITIONED BY (`dt1` string, `dt2` string)
> {code}
> insert SQL:
> {code:sql}
> insert into table tmp.id_name_dt1_dt2 partition(dt1='beijing',dt2) select * from tmp.id_name;
> {code}
> result:
> No data is inserted, and no partition is added to the dest table.
> There are also two warnings:
> {code}
> 20/08/07 14:32:28 WARN warehouse: Cannot create partition spec from hdfs://nameservice/; missing keys [dt1]
> 20/08/07 14:32:28 WARN FileOperations: Ignoring invalid DP directory hdfs://nameservice/user/hive/warehouse/tmp.db/id_name_dt1_dt2/.hive-staging_hive_2020-08-07_14-32-02_538_7897451753303149223-1/-ext-10000/dt2=2002
> {code}
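> A possible workaround (a sketch only, not verified on the HDP 3.1.4 build above) is to avoid the mixed static/dynamic spec entirely: declare both partition keys as dynamic and supply the formerly static value as a literal in the SELECT list, so the projected columns line up as id, name, dt1, dt2:
> {code:sql}
> -- all-dynamic partitioning usually requires nonstrict mode in Hive
> SET hive.exec.dynamic.partition.mode=nonstrict;
> INSERT INTO TABLE tmp.id_name_dt1_dt2 PARTITION (dt1, dt2)
> SELECT id, name, 'beijing' AS dt1, dt FROM tmp.id_name;
> {code}
> This sidesteps the partition-spec construction that the first warning ("missing keys [dt1]") complains about, at the cost of listing the columns explicitly instead of using SELECT *.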
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org