Posted to commits@hudi.apache.org by "sivabalan narayanan (Jira)" <ji...@apache.org> on 2021/01/26 16:23:01 UTC

[jira] [Updated] (HUDI-1374) hudi table support dynamic partitioning

     [ https://issues.apache.org/jira/browse/HUDI-1374?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

sivabalan narayanan updated HUDI-1374:
--------------------------------------
    Labels:   (was: user-supp)

> hudi table support dynamic partitioning
> ---------------------------------------
>
>                 Key: HUDI-1374
>                 URL: https://issues.apache.org/jira/browse/HUDI-1374
>             Project: Apache Hudi
>          Issue Type: Sub-task
>          Components: Spark Integration
>            Reporter: liwei
>            Assignee: liwei
>            Priority: Major
>
> Discussed in [https://github.com/apache/hudi/pull/2196]
> A. Thanks so much. This PR needs to solve the issue with a better approach.
>  Now I am clearer about the overwrite semantics between table.overwrite and Spark SQL overwrite for Hudi.
> B. Also, Spark SQL overwrite for Hudi should have the same capabilities as Spark SQL, Hive, and Delta Lake.
>  These engines have three modes for overwriting partitions (see the sketch after the links below):
>  1. Dynamic partition: delete all partition data, then insert the new data for the different partitions
>  2. Static partition: overwrite only the partitions the user specifies
>  3. Mixed partition: a mix of 1 and 2
>  More detail in:
>  [https://spark.apache.org/docs/3.0.0-preview/sql-ref-syntax-dml-insert-overwrite-table.html]
>  [https://www.programmersought.com/article/47155360487/]
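>
> Purely as illustration of the three modes above, here is a minimal Spark SQL sketch (not code from this PR); the table and column names (events, staging_events, id, name, country, dt) are hypothetical:
> {code:scala}
> // Sketch: the three partition-overwrite modes via Spark SQL (datasource tables).
> // Hypothetical tables: 'events' partitioned by (country, dt), staged rows in 'staging_events'.
> import org.apache.spark.sql.SparkSession
>
> val spark = SparkSession.builder()
>   .appName("overwrite-modes-sketch")
>   // With "dynamic", only partitions that actually receive new rows are replaced;
>   // the default "static" first drops every partition matched by the PARTITION clause.
>   .config("spark.sql.sources.partitionOverwriteMode", "dynamic")
>   .enableHiveSupport()
>   .getOrCreate()
>
> // 1. Dynamic partition: no partition values fixed; target partitions come from the data.
> spark.sql(
>   """INSERT OVERWRITE TABLE events PARTITION (country, dt)
>     |SELECT id, name, country, dt FROM staging_events""".stripMargin)
>
> // 2. Static partition: only the user-specified partition is overwritten.
> spark.sql(
>   """INSERT OVERWRITE TABLE events PARTITION (country = 'US', dt = '2021-01-26')
>     |SELECT id, name FROM staging_events
>     |WHERE country = 'US' AND dt = '2021-01-26'""".stripMargin)
>
> // 3. Mixed partition: 'country' is fixed, 'dt' is resolved dynamically from the data.
> spark.sql(
>   """INSERT OVERWRITE TABLE events PARTITION (country = 'US', dt)
>     |SELECT id, name, dt FROM staging_events WHERE country = 'US'""".stripMargin)
> {code}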
> Just FYI, in the [RFC|https://cwiki.apache.org/confluence/display/HUDI/RFC+-+18+Insert+Overwrite+API#RFC18InsertOverwriteAPI-API] we discussed having the 'insert_overwrite_table' operation to support dynamic partitioning; static partitioning is supported by 'insert_overwrite' (see the writer sketch below).
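>
> For concreteness, a minimal Hudi writer sketch (my reading of the RFC, not code from this PR) contrasting the two operations; the table/column names (events, uuid, ts, dt) and base path are hypothetical:
> {code:scala}
> import org.apache.spark.sql.{DataFrame, SaveMode}
>
> // Sketch: choose between Hudi's two overwrite operations at write time.
> def writeOverwrite(df: DataFrame, operation: String): Unit = {
>   df.write.format("hudi")
>     .option("hoodie.table.name", "events")
>     .option("hoodie.datasource.write.recordkey.field", "uuid")
>     .option("hoodie.datasource.write.precombine.field", "ts")
>     .option("hoodie.datasource.write.partitionpath.field", "dt")
>     // "insert_overwrite": replace only the partitions present in df (static-partition style).
>     // "insert_overwrite_table": replace the whole table (the dynamic-partitioning case above).
>     .option("hoodie.datasource.write.operation", operation)
>     .mode(SaveMode.Append)
>     .save("/tmp/hudi/events")
> }
>
> // writeOverwrite(batchDf, "insert_overwrite")        // overwrite touched partitions
> // writeOverwrite(batchDf, "insert_overwrite_table")  // overwrite the entire table
> {code}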


