Posted to commits@hudi.apache.org by "liwei (Jira)" <ji...@apache.org> on 2020/11/05 16:35:00 UTC

[jira] [Created] (HUDI-1374) spark sql for hudi support Static partition mode

liwei created HUDI-1374:
---------------------------

             Summary: spark sql for hudi support Static partition mode
                 Key: HUDI-1374
                 URL: https://issues.apache.org/jira/browse/HUDI-1374
             Project: Apache Hudi
          Issue Type: Sub-task
          Components: Spark Integration
            Reporter: liwei


Discussed in https://github.com/apache/hudi/pull/2196


A. Thanks so much. This PR needs to solve the issue with a better approach.
Now I am clearer about the difference in overwrite semantics between table.overwrite and Spark SQL overwrite for Hudi.

B. Spark SQL overwrite for Hudi should also have the same capabilities as Spark SQL, Hive, and Delta Lake.
These engines support three partition overwrite modes (sketched in the example after the links below):
1. Dynamic partition: delete the existing partition data, then insert the new data into the partitions derived from the query result.
2. Static partition: overwrite only the partitions the user explicitly specifies.
3. Mixed partition: a combination of 1 and 2 (some partition columns specified statically, the rest resolved dynamically).
More detail in:
https://spark.apache.org/docs/3.0.0-preview/sql-ref-syntax-dml-insert-overwrite-table.html
https://www.programmersought.com/article/47155360487/
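
For reference, a rough sketch of what the three modes look like as Spark SQL statements. The table and column names are made up for illustration, and how each mode should behave on a Hudi table is exactly what this issue is about:

    // Hypothetical table: logs(msg STRING) PARTITIONED BY (country STRING, dt STRING)

    // 1. Dynamic partition: no partition value is fixed; which partitions get
    //    overwritten depends on the rows produced by the SELECT (and on the
    //    engine's partition overwrite mode).
    spark.sql("""
      INSERT OVERWRITE TABLE logs PARTITION (country, dt)
      SELECT msg, country, dt FROM staging
    """)

    // 2. Static partition: every partition column is given a literal value,
    //    so only that single partition is replaced.
    spark.sql("""
      INSERT OVERWRITE TABLE logs PARTITION (country = 'US', dt = '2020-11-05')
      SELECT msg FROM staging
    """)

    // 3. Mixed partition: some partition columns are fixed, the rest are dynamic.
    spark.sql("""
      INSERT OVERWRITE TABLE logs PARTITION (country = 'US', dt)
      SELECT msg, dt FROM staging
    """)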

C. Our plan:
1. Currently, Spark SQL overwrite for Hudi is dynamic partition only. I will resolve it in HUDI-1349: first support deleting all partitions in HUDI-1350, then land HUDI-1349.
2. Spark SQL for Hudi does not yet support "Static partition" mode; we will then land it in this issue (see the sketch below).
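
To illustrate the intended "Static partition" semantics only (this is not the implementation plan), an overwrite restricted to one user-specified partition could translate on the datasource path to something like the sketch below. It assumes Hudi's insert_overwrite write operation is available; the table name, field names, and path are made up:

    // Sketch: overwrite a single, user-specified partition of a Hudi table,
    // leaving every other partition untouched.
    val staged = spark.table("staging").where("dt = '2020-11-05'")

    staged.write
      .format("hudi")
      // replace only the partitions touched by the incoming data
      .option("hoodie.datasource.write.operation", "insert_overwrite")
      .option("hoodie.datasource.write.recordkey.field", "uuid")
      .option("hoodie.datasource.write.partitionpath.field", "dt")
      .option("hoodie.table.name", "logs")
      .mode("append")
      .save("/tmp/hudi/logs")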



--
This message was sent by Atlassian Jira
(v8.3.4#803005)