Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2018/01/03 14:21:00 UTC

[jira] [Resolved] (SPARK-20236) Overwrite a partitioned data source table should only overwrite related partitions

     [ https://issues.apache.org/jira/browse/SPARK-20236?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li resolved SPARK-20236.
-----------------------------
       Resolution: Fixed
         Assignee: Wenchen Fan
    Fix Version/s: 2.3.0

> Overwrite a partitioned data source table should only overwrite related partitions
> ----------------------------------------------------------------------------------
>
>                 Key: SPARK-20236
>                 URL: https://issues.apache.org/jira/browse/SPARK-20236
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: Wenchen Fan
>            Assignee: Wenchen Fan
>              Labels: releasenotes
>             Fix For: 2.3.0
>
>
> When we overwrite a partitioned data source table, Spark currently truncates either the entire table or a set of partitions determined by the given static partition values, before writing the new data.
> For example, {{INSERT OVERWRITE tbl ...}} will truncate the entire table, while {{INSERT OVERWRITE tbl PARTITION (a=1, b)}} will truncate all the partitions that start with {{a=1}}.
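> As a concrete sketch of the current behavior (the column name {{i}} and the {{parquet}} format are illustrative assumptions):
> {code:sql}
> -- a data source table with one data column, partitioned by a and b
> CREATE TABLE tbl (i INT, a INT, b INT) USING parquet PARTITIONED BY (a, b);
> INSERT INTO tbl VALUES (1, 1, 1), (2, 1, 2), (3, 2, 3);
>
> -- in 2.2 this truncates ALL partitions before writing the new row,
> -- leaving a=1/b=1 as the only partition afterwards
> INSERT OVERWRITE TABLE tbl SELECT 9, 1, 1;
> {code}
> and with a static partition spec:
> {code:sql}
> -- run against the freshly seeded table, this truncates every partition
> -- under a=1 (both a=1/b=1 and a=1/b=2) but keeps a=2/b=3
> INSERT OVERWRITE TABLE tbl PARTITION (a=1, b) SELECT 9, 2;
> {code}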
> This behavior is somewhat reasonable, since the partitions to be overwritten are known before runtime. However, Hive behaves differently: it only overwrites the partitions that actually receive new data. For example, {{INSERT OVERWRITE tbl SELECT 1,2,3}} will only overwrite partition {{a=2, b=3}}, assuming {{tbl}} has a single data column and is partitioned by {{a}} and {{b}}.
> It would be better to follow Hive's behavior.
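> Under Hive's semantics, the same full-table overwrite only replaces the partitions that actually receive rows (a sketch against the same hypothetical table as above):
> {code:sql}
> -- only partition a=2/b=3 is rewritten; a=1/b=1 and a=1/b=2 are untouched
> INSERT OVERWRITE TABLE tbl SELECT 1, 2, 3;
> {code}
> (The 2.3.0 fix gates this dynamic behavior behind a session conf, {{spark.sql.sources.partitionOverwriteMode=dynamic}}; the default remains {{static}}.)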



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org