Posted to issues@spark.apache.org by "Mulan (Jira)" <ji...@apache.org> on 2020/07/23 05:46:00 UTC
[jira] [Commented] (SPARK-31605) Unable to insert data with partial dynamic partition with Spark & Hive 3
[ https://issues.apache.org/jira/browse/SPARK-31605?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17163233#comment-17163233 ]
Mulan commented on SPARK-31605:
-------------------------------
HDP 3.1.x can't use Spark 2.4. Is there any other way?
> Unable to insert data with partial dynamic partition with Spark & Hive 3
> ------------------------------------------------------------------------
>
> Key: SPARK-31605
> URL: https://issues.apache.org/jira/browse/SPARK-31605
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.2
> Environment: Hortonworks HDP 3.1.0
> Spark 2.3.2
> Hive 3
> Reporter: Amit Ashish
> Priority: Major
>
> When inserting data with dynamic partitions, the operation fails if not all of the partition columns are dynamic. For example:
>
> {code:sql}
> create external table test_insert(a int) partitioned by (part_a string, part_b string) stored as parquet location '<HDFS location>';
>
> {code}
> The query
> {code:sql}
> insert into table test_insert partition(part_a='a', part_b) values (3, 'b');
> {code}
> fails with the following errors:
> {code}
> Cannot create partition spec from hdfs://xxxx/ ; missing keys [part_a]
> Ignoring invalid DP directory <path to staging directory>
> {code}
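>
> One thing worth ruling out before digging further (an assumption on my part, not a confirmed fix for this bug) is that dynamic partitioning is enabled in the session at all. The standard Hive settings, which Spark's Hive support also honors, are:
> {code:sql}
> -- Assumed prerequisite settings for any dynamic-partition insert;
> -- a sanity check only, not a confirmed fix for this issue.
> SET hive.exec.dynamic.partition=true;
> SET hive.exec.dynamic.partition.mode=nonstrict;
> {code}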
>
> On the other hand, if I remove the static value for part_a and make the insert fully dynamic, the following query succeeds. Note that the query below is not the issue; the issue is the one above, where the query fails with the invalid DP directory warning.
> {code:sql}
> insert into table test_insert partition(part_a, part_b) values (1,'a','b');
> {code}
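>
> Since the fully dynamic form succeeds, one possible workaround (a sketch based on this report, not a verified fix) is to rewrite the partial-static insert as a fully dynamic one and pass the static partition value as a literal, keeping the dynamic partition columns last in the value list:
> {code:sql}
> -- Equivalent to the failing query:
> --   insert into table test_insert partition(part_a='a', part_b) values (3, 'b');
> -- but expressed with all partition columns dynamic.
> insert into table test_insert partition(part_a, part_b) values (3, 'a', 'b');
> {code}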