Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/09/17 06:52:00 UTC

[jira] [Assigned] (SPARK-32508) Disallow empty part col values in partition spec before static partition writing

     [ https://issues.apache.org/jira/browse/SPARK-32508?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan reassigned SPARK-32508:
-----------------------------------

    Assignee: dzcxzl

> Disallow empty part col values in partition spec before static partition writing
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-32508
>                 URL: https://issues.apache.org/jira/browse/SPARK-32508
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: dzcxzl
>            Assignee: dzcxzl
>            Priority: Trivial
>
> When a static partition write runs with an empty value for a partition column, the error is only reported after all write tasks have completed.
> We can reject such a write before the job is submitted; see the reproduction sketch after the stack trace below.
>  
> {code:java}
> org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: get partition: Value for key d is null or empty;
>     at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:113)
>     at org.apache.spark.sql.hive.HiveExternalCatalog.getPartitionOption(HiveExternalCatalog.scala:1212)
>     at org.apache.spark.sql.catalyst.catalog.ExternalCatalogWithListener.getPartitionOption(ExternalCatalogWithListener.scala:240)
>     at org.apache.spark.sql.hive.execution.InsertIntoHiveTable.processInsert(InsertIntoHiveTable.scala:276)
> {code}
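>
> A minimal reproduction sketch, assuming a Hive-enabled session where "spark" is the SparkSession (as in spark-shell); the table name "t" and partition column "d" are illustrative, not taken from the ticket:
> {code:scala}
> // Create a partitioned Hive table, then attempt a static-partition
> // write with an empty value for the partition column d.
> spark.sql("CREATE TABLE t (c1 INT) PARTITIONED BY (d STRING) STORED AS PARQUET")
>
> // The empty value is only rejected by the Hive metastore after all
> // write tasks have finished, surfacing the AnalysisException above.
> spark.sql("INSERT OVERWRITE TABLE t PARTITION (d = '') SELECT 1")
> {code}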
>
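> A sketch of the kind of up-front check the ticket proposes, assuming the static partition spec is available as a Map[String, Option[String]] (as in Spark's insert commands) and the public AnalysisException constructor of the affected version (3.1.0); the helper name and message wording are hypothetical, not the actual patch:
> {code:scala}
> import org.apache.spark.sql.AnalysisException
>
> // Hypothetical pre-submit validation: fail fast on empty static
> // partition values instead of failing after all tasks complete.
> def checkEmptyPartitionValues(partitionSpec: Map[String, Option[String]]): Unit = {
>   partitionSpec.foreach { case (key, value) =>
>     // Static partition columns carry Some(value); dynamic ones carry None.
>     if (value.exists(_.isEmpty)) {
>       throw new AnalysisException(
>         s"Partition spec is invalid: the value of partition column '$key' " +
>           "must not be empty")
>     }
>   }
> }
> {code}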



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org