Posted to commits@hudi.apache.org by "Zhaojing Yu (Jira)" <ji...@apache.org> on 2022/10/01 12:16:00 UTC

[jira] [Resolved] (HUDI-4237) spark.sql.sources.schema.partCol.0 is non-empty in HiveMetaStore when creating a non-partitioned Hudi table in Spark

     [ https://issues.apache.org/jira/browse/HUDI-4237?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Zhaojing Yu resolved HUDI-4237.
-------------------------------

> spark.sql.sources.schema.partCol.0 is non-empty in HiveMetaStore when creating a non-partitioned Hudi table in Spark
> --------------------------------------------------------------------------------------------------------------------
>
>                 Key: HUDI-4237
>                 URL: https://issues.apache.org/jira/browse/HUDI-4237
>             Project: Apache Hudi
>          Issue Type: Bug
>            Reporter: du.junling
>            Priority: Critical
>              Labels: pull-request-available
>             Fix For: 0.12.1
>
>
> Creating a non-partitioned Hudi table in Spark stores spark.sql.sources.schema.partCol.0 with an empty value in the HiveMetastore.
> This is unexpected behavior.
> Steps to reproduce the behavior:
> 1. Create a non-partitioned Hudi table in Spark:
> {code:java}
> create table hudi_mor_tbl (
>   id int,
>   name string,
>   price double,
>   ts bigint
> ) using hudi
> tblproperties (
>   type = 'mor',
>   primaryKey = 'id',
>   preCombineField = 'ts'
> );
> {code}
> 2. Insert one row of data into it:
> {code:java}
> insert into hudi_mor_tbl select 1, 'a1', 20, 1000; {code}
> 3. Cat hoodie.properties in the table's base path; it includes the hoodie.table.partition.fields key with an empty value:
> {code:java}
> hoodie.table.partition.fields=
>  {code}
> 4. Check the spark.sql.sources.schema.partCol.0 entry stored in the TABLE_PARAMS table of the HiveMetaStore:
> {code:java}
> |50|spark.sql.sources.schema.partCol.0|
>  {code}
> It has an empty value ("").
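> A minimal sketch of how this entry can be inspected, assuming a MySQL-backed HiveMetaStore with the standard metastore schema (the table name hudi_mor_tbl is just the one created above; the TABLE_PARAMS/TBLS layout is the usual metastore one, not anything specific to this report):
> {code:java}
> -- Look up the Spark schema properties recorded for hudi_mor_tbl
> -- in the metastore backend database.
> SELECT p.PARAM_KEY, p.PARAM_VALUE
> FROM TABLE_PARAMS p
> JOIN TBLS t ON p.TBL_ID = t.TBL_ID
> WHERE t.TBL_NAME = 'hudi_mor_tbl'
>   AND p.PARAM_KEY LIKE 'spark.sql.sources.schema%';
> {code}
> For a non-partitioned table, spark.sql.sources.schema.partCol.0 should not appear in this result at all; here it comes back with an empty PARAM_VALUE.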
> *Expected behavior*
> There should be no hoodie.table.partition.fields entry in hoodie.properties and no spark.sql.sources.schema.partCol.0 entry in the HiveMetastore.
> *Environment Description*
>  * Hudi version : 0.10.0
>  * Spark version : 3.2.1
>  * Hive version : 3.1.2
>  * Hadoop version : 3.3.1
>  * Storage (HDFS/S3/GCS..) : HDFS
>  * Running on Docker? (yes/no) : no



--
This message was sent by Atlassian Jira
(v8.20.10#820010)