Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2018/07/19 17:50:00 UTC

[jira] [Assigned] (SPARK-24860) Expose dynamic partition overwrite per write operation

     [ https://issues.apache.org/jira/browse/SPARK-24860?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-24860:
------------------------------------

    Assignee:     (was: Apache Spark)

> Expose dynamic partition overwrite per write operation
> ------------------------------------------------------
>
>                 Key: SPARK-24860
>                 URL: https://issues.apache.org/jira/browse/SPARK-24860
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.3.1
>            Reporter: koert kuipers
>            Priority: Minor
>
> This is a follow-up to SPARK-20236.
> Also see the discussion in pull request https://github.com/apache/spark/pull/18714
> SPARK-20236 added a global setting, spark.sql.sources.partitionOverwriteMode, to switch between static and dynamic overwrite of partitioned tables. It would be nice if we could choose, per partitioned overwrite operation, whether its behavior is static or dynamic. The suggested syntax is:
> {noformat}
> df.write.option("partitionOverwriteMode", "dynamic").parquet...
> {noformat}
>  
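
For context, here is a minimal sketch of how the existing session-wide setting from SPARK-20236 and the proposed per-write option would sit next to each other. The SparkSession, the sample DataFrame, and the output path are illustrative assumptions, not part of the ticket:

{noformat}
import org.apache.spark.sql.{SaveMode, SparkSession}

// Illustrative session and data; names and the output path are assumptions for this sketch.
val spark = SparkSession.builder().appName("dynamic-overwrite-sketch").getOrCreate()
import spark.implicits._
val df = Seq(("2018-07-19", 1), ("2018-07-20", 2)).toDF("date", "value")

// Existing behavior (SPARK-20236): a session-wide configuration.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

// Proposed here: choose the mode per write operation, overriding the
// session setting for just this call.
df.write
  .mode(SaveMode.Overwrite)
  .option("partitionOverwriteMode", "dynamic")
  .partitionBy("date")
  .parquet("/tmp/example/output")
{noformat}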



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org