Posted to commits@hudi.apache.org by "chenbodeng719 (via GitHub)" <gi...@apache.org> on 2023/04/01 13:54:25 UTC

[GitHub] [hudi] chenbodeng719 commented on issue #8166: [SUPPORT] Hudi Bucket Index

chenbodeng719 commented on issue #8166:
URL: https://github.com/apache/hudi/issues/8166#issuecomment-1492976098

   @KnightChess  Hi, I used the config below to test bulk insert, but only one parquet file came out. Did I miss something? I expected 5 parquet files (5 buckets). My dataset is about 120GB.
   ```
   CREATE TABLE hbase2hudi_sink (
       uid STRING PRIMARY KEY NOT ENFORCED,
       oridata STRING,
       update_time TIMESTAMP_LTZ(3)
   ) WITH (
       'table.type' = 'MERGE_ON_READ',
       'connector' = 'hudi',
       'path' = '%s',
       'write.operation' = 'bulk_insert',
       'precombine.field' = 'update_time',
       'write.tasks' = '2',
       -- bucket index settings: hash records on uid into 5 buckets
       'index.type' = 'BUCKET',
       'hoodie.bucket.index.hash.field' = 'uid',
       'hoodie.bucket.index.num.buckets' = '5'
   )
   ```
   (screenshot: https://user-images.githubusercontent.com/104059106/229291867-c6c4f9fa-1183-4adb-838b-c72684868b6f.png)
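   For reference, the write itself is just an INSERT into this sink. A minimal sketch is below; the source table `hbase_source` and its column list are hypothetical placeholders for illustration, not my real job:

   ```
   -- Hypothetical bulk_insert write into the sink defined above. With
   -- 'index.type' = 'BUCKET' and 'hoodie.bucket.index.num.buckets' = '5',
   -- records should be hashed on uid into 5 file groups, which is why I
   -- expect up to 5 base parquet files.
   INSERT INTO hbase2hudi_sink
   SELECT uid, oridata, update_time
   FROM hbase_source;
   ```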
   

