Posted to reviews@spark.apache.org by kiszk <gi...@git.apache.org> on 2018/08/03 12:52:39 UTC
[GitHub] spark pull request #21087: [SPARK-23997][SQL] Configurable maximum number of...
Github user kiszk commented on a diff in the pull request:
https://github.com/apache/spark/pull/21087#discussion_r207534444
--- Diff: sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala ---
@@ -580,6 +580,11 @@ object SQLConf {
.booleanConf
.createWithDefault(true)
+ val BUCKETING_MAX_BUCKETS = buildConf("spark.sql.bucketing.maxBuckets")
+ .doc("The maximum number of buckets allowed. Defaults to 100000")
+ .longConf
--- End diff --
Why is this type `long` while the type of `numBuckets` is `Int`?
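For context, a revision addressing this comment might look like the following sketch. This is hypothetical, not the actual change merged in the PR: it assumes Spark's internal `ConfigBuilder` DSL (`buildConf`, `intConf`, `checkValue`, `createWithDefault`) and switches to `intConf` so the config's type matches `numBuckets: Int`, adding a positivity check:

```scala
// Hypothetical sketch: declare the conf as Int to match numBuckets,
// and validate the value when it is set rather than when it is read.
val BUCKETING_MAX_BUCKETS = buildConf("spark.sql.bucketing.maxBuckets")
  .doc("The maximum number of buckets allowed. Defaults to 100000")
  .intConf
  .checkValue(_ > 0, "the value of spark.sql.bucketing.maxBuckets must be positive")
  .createWithDefault(100000)
```

Using `intConf` avoids a lossy `Long`-to-`Int` conversion at every call site that compares the limit against `numBuckets`.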
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org