Posted to issues@iceberg.apache.org by "jackye1995 (via GitHub)" <gi...@apache.org> on 2023/04/27 03:49:21 UTC

[GitHub] [iceberg] jackye1995 commented on a diff in pull request #7422: Spark 3.4: Support rate limit in Spark Streaming

jackye1995 commented on code in PR #7422:
URL: https://github.com/apache/iceberg/pull/7422#discussion_r1178595519


##########
spark/v3.4/spark/src/main/java/org/apache/iceberg/spark/SparkReadConf.java:
##########
@@ -255,6 +255,22 @@ public Long endTimestamp() {
     return confParser.longConf().option(SparkReadOptions.END_TIMESTAMP).parseOptional();
   }
 
+  public Integer maxFilesPerMicroBatch() {

Review Comment:
   I wonder if this is the right approach. Using boxed references for many configs simply avoids the need to use extreme sentinel values like int/long min/max. I feel it makes more sense to keep that behavior and remove the usage of `Long.MIN_VALUE` here. Any thoughts?
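
   To make the tradeoff concrete, here is a minimal sketch of the two styles being contrasted (this uses a hypothetical `RateLimitConfSketch` class standing in for `SparkReadConf`; the real Iceberg `confParser` builder API is not reproduced here). With a boxed `Integer`, an unset option is simply `null`, whereas the sentinel style forces every caller to remember to special-case an extreme value:

   ```java
   import java.util.Optional;

   // Hypothetical stand-in for SparkReadConf, illustrating boxed vs.
   // sentinel handling of an optional config such as maxFilesPerMicroBatch.
   public class RateLimitConfSketch {
       private final Optional<Integer> maxFilesPerMicroBatch;

       public RateLimitConfSketch(Integer maxFiles) {
           // null means the user did not set the option.
           this.maxFilesPerMicroBatch = Optional.ofNullable(maxFiles);
       }

       // Boxed style (analogous to parseOptional): absence is represented
       // by null, so no sentinel value is needed and callers can do a
       // plain null check.
       public Integer maxFilesPerMicroBatchOrNull() {
           return maxFilesPerMicroBatch.orElse(null);
       }

       // Sentinel style: absence is encoded as an extreme value
       // (Integer.MAX_VALUE here), which every caller must remember
       // to treat as "unlimited".
       public int maxFilesPerMicroBatchOrSentinel() {
           return maxFilesPerMicroBatch.orElse(Integer.MAX_VALUE);
       }
   }
   ```

   A caller of the boxed variant would write `if (conf.maxFilesPerMicroBatchOrNull() != null) { ... }` instead of comparing against a magic constant, which is the behavior the comment suggests preserving.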



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org
