Posted to issues@spark.apache.org by "Daniel Mateus Pires (JIRA)" <ji...@apache.org> on 2018/09/20 10:01:00 UTC
[jira] [Updated] (SPARK-25480) Dynamic partitioning + saveAsTable with multiple partition columns creates an empty directory
[ https://issues.apache.org/jira/browse/SPARK-25480?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Daniel Mateus Pires updated SPARK-25480:
----------------------------------------
Attachment: dynamic_partitioning.json
> Dynamic partitioning + saveAsTable with multiple partition columns creates an empty directory
> ----------------------------------------------------------------------------------------------
>
> Key: SPARK-25480
> URL: https://issues.apache.org/jira/browse/SPARK-25480
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.0
> Reporter: Daniel Mateus Pires
> Priority: Minor
> Attachments: dynamic_partitioning.json
>
>
> We use .saveAsTable with dynamic partitioning as our only way to write data to S3 from Spark.
> When only one partition column is defined for a table, .saveAsTable behaves as expected:
> - with Overwrite mode, it creates the table if it doesn't exist and writes the data
> - with Append mode, it appends to the given partition
> - with Overwrite mode, if the table exists, it overwrites the partition
> If two partition columns are used, however, the directory is created on S3 with the _SUCCESS file, but no data is actually written.
> Our workaround is to check whether the table exists and, if it does not, set the partition overwrite mode back to static before calling saveAsTable:
> {code}
> spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
> df.write.mode("overwrite").partitionBy("year", "month").option("path", "s3://hbc-data-warehouse/integration/users_test").saveAsTable("users_test")
> {code}
>
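The workaround described in the report (fall back to static overwrite mode when the table does not yet exist) can be sketched as below. This is an illustrative sketch, not the reporter's exact code: the `overwrite_mode` helper is hypothetical, the table name and S3 path are placeholders, and the existence check assumes PySpark's `spark.catalog.tableExists` API.

```python
def overwrite_mode(table_exists):
    """Pick the partitionOverwriteMode for a saveAsTable call.

    Dynamic overwrite only replaces the partitions present in the
    DataFrame; for the initial table creation we fall back to static
    mode to sidestep the empty-directory behaviour described above.
    """
    return "dynamic" if table_exists else "static"


# Usage against a live SparkSession (sketch; names are illustrative):
#
# mode = overwrite_mode(spark.catalog.tableExists("users_test"))
# spark.conf.set("spark.sql.sources.partitionOverwriteMode", mode)
# (df.write.mode("overwrite")
#     .partitionBy("year", "month")
#     .option("path", "s3://my-bucket/integration/users_test")
#     .saveAsTable("users_test"))
```

Once the table exists, subsequent writes run with dynamic mode, so overwriting touches only the partitions present in the incoming DataFrame.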
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org