Posted to commits@hudi.apache.org by "weimingdiit (via GitHub)" <gi...@apache.org> on 2023/02/14 01:26:14 UTC

[GitHub] [hudi] weimingdiit commented on a diff in pull request #7362: [HUDI-5315] The record size is dynamically estimated when the table i…

weimingdiit commented on code in PR #7362:
URL: https://github.com/apache/hudi/pull/7362#discussion_r1105193398


##########
hudi-client/hudi-spark-client/src/main/java/org/apache/hudi/table/action/commit/BaseSparkCommitActionExecutor.java:
##########
@@ -418,4 +426,23 @@ public Partitioner getLayoutPartitioner(WorkloadProfile profile, String layoutPa
   protected void runPrecommitValidators(HoodieWriteMetadata<HoodieData<WriteStatus>> writeMetadata) {
     SparkValidatorUtils.runValidators(config, writeMetadata, context, table, instantTime);
   }
+
+  private int dynamicSampleRecordSize(JavaRDD<HoodieRecord<T>> inputRecords) {
+    int dynamicSampleRecordSize = config.getCopyOnWriteRecordSizeEstimate();
+    long inputRecordsCount = inputRecords.count();
+    if (inputRecordsCount == 0) {
+      LOG.warn("inputRecords is empty.");
+      return dynamicSampleRecordSize;
+    }
+    int maxSampleRecordNum = (int) Math.ceil(Math.min(inputRecordsCount * config.getRecordSizeDynamicSamplingRatio(), config.getRecordSizeDynamicSamplingMaxnum()));
+    try {
+      List<HoodieRecord<T>> sampleRecords = inputRecords.takeSample(false, maxSampleRecordNum);
+      dynamicSampleRecordSize = (int) (ObjectSizeCalculator.getObjectSize(sampleRecords) / maxSampleRecordNum);

Review Comment:
   @alexeykudinkin Thanks, Alexey! Got it. I'll optimize this code.
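
Editor's note: a hedged sketch, not the PR's actual follow-up. It shows one way the averaging step of the sampling helper in the diff above could be hardened; it assumes the sample has already been drawn with JavaRDD#takeSample as in the diff, and the class name RecordSizeSampling, the ToLongFunction parameter, and the choice of the actual sample size as the divisor are illustrative, not code from #7362.

import java.util.List;
import java.util.function.ToLongFunction;

public final class RecordSizeSampling {

  private RecordSizeSampling() {
  }

  /**
   * Averages the estimated in-memory size of the sampled records.
   *
   * @param sampleRecords    records returned by takeSample; may be smaller than requested, or empty
   * @param sizeOfRecord     per-record size estimator (in Hudi this could be ObjectSizeCalculator::getObjectSize)
   * @param fallbackEstimate static estimate used when there is nothing to sample
   *                         (e.g. the configured copy-on-write record size estimate)
   */
  public static int averageRecordSize(List<?> sampleRecords,
                                      ToLongFunction<Object> sizeOfRecord,
                                      int fallbackEstimate) {
    if (sampleRecords == null || sampleRecords.isEmpty()) {
      // Nothing was sampled: fall back to the static estimate instead of dividing by zero.
      return fallbackEstimate;
    }
    long totalBytes = 0L;
    for (Object record : sampleRecords) {
      totalBytes += sizeOfRecord.applyAsLong(record);
    }
    // Divide by the number of records actually sampled rather than the requested
    // sample count, so a short sample does not deflate the estimate.
    return (int) Math.max(1L, totalBytes / sampleRecords.size());
  }
}

In the diff above, the size of the whole sample list is divided by maxSampleRecordNum; averaging per record actually returned, as sketched here, keeps the estimate sensible if takeSample ever returns fewer records than requested.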


