Posted to issues@spark.apache.org by "zzzzming95 (Jira)" <ji...@apache.org> on 2022/08/13 06:29:00 UTC

[jira] [Commented] (SPARK-40062) Spark - Creating Sub Folder while writing to Partitioned Hive Table

    [ https://issues.apache.org/jira/browse/SPARK-40062?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17579217#comment-17579217 ] 

zzzzming95 commented on SPARK-40062:
------------------------------------

I can't reproduce the problem.
Could you give the specific steps to reproduce it, such as how you write to the partition?
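For example, is it a plain static-partition insert like the sketch below? This is only an illustration of the kind of detail needed; staging_t1 is a made-up source table, not something from the report.

INSERT OVERWRITE TABLE T1 PARTITION (process_date='20220812')
SELECT name, address FROM staging_t1;  -- hypothetical write against the T1 table defined in the report

Or is it a dynamic-partition write? The exact statement would help.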

> Spark - Creating Sub Folder while writing to Partitioned Hive Table
> -------------------------------------------------------------------
>
>                 Key: SPARK-40062
>                 URL: https://issues.apache.org/jira/browse/SPARK-40062
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 2.4.7
>            Reporter: dinesh sachdev
>            Priority: Minor
>
> We had been writing to a partitioned Hive table and realized that the data being written ends up in an extra sub-folder.
> For example, consider the table definition below:
> CREATE TABLE T1 (name STRING, address STRING) PARTITIONED BY (process_date STRING) STORED AS PARQUET LOCATION '/mytable/a/b/c/org=employee';
>  
> While writing to the table, the HDFS path being written looks something like this:
> /mytable/a/b/c/org=employee/process_date=20220812/org=employee
>  
> The unnecessary addition of org=employee after the process_date partition happens because the table location contains an "=" character, which Hive uses as the key=value syntax for partition directories.
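> To make the mechanism concrete, here is how the bad path breaks down under that key=value convention (an illustrative annotation, not output from any tool):
>
> /mytable/a/b/c/           -- intended base path
> org=employee/             -- read as partition column 'org' with value 'employee'
> process_date=20220812/    -- the real partition column
> org=employee              -- presumably re-emitted because 'org' was treated as a partition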
> Re-defining the table without the "=" in its location resolves the problem:
> CREATE TABLE T1 (name STRING, address STRING) PARTITIONED BY (process_date STRING) STORED AS PARQUET LOCATION '/mytable/a/b/c/employee';
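> With the corrected location, the same kind of write lands where expected (again only a sketch; staging_t1 is a hypothetical source table):
>
> INSERT OVERWRITE TABLE T1 PARTITION (process_date='20220812')
> SELECT name, address FROM staging_t1;
> -- expected layout: /mytable/a/b/c/employee/process_date=20220812/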


