Posted to reviews@spark.apache.org by kiszk <gi...@git.apache.org> on 2018/10/23 12:21:38 UTC
[GitHub] spark pull request #22754: [SPARK-25776][CORE]The disk write buffer size mus...
Github user kiszk commented on a diff in the pull request:
https://github.com/apache/spark/pull/22754#discussion_r227363331
--- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -495,8 +495,8 @@ package object config {
ConfigBuilder("spark.shuffle.spill.diskWriteBufferSize")
.doc("The buffer size, in bytes, to use when writing the sorted records to an on-disk file.")
.bytesConf(ByteUnit.BYTE)
- .checkValue(v => v > 0 && v <= Int.MaxValue,
- s"The buffer size must be greater than 0 and less than ${Int.MaxValue}.")
+ .checkValue(v => v > 12 && v <= Int.MaxValue,
+ s"The buffer size must be greater than 12 and less than ${Int.MaxValue}.")
--- End diff --
Sorry for bothering you. Can we handle this in this PR, too?
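
For context, the `checkValue` call in the diff pairs a predicate with an error message and rejects any configured value that fails the predicate. A minimal standalone sketch of that validation pattern (plain Scala; this is not Spark's actual `ConfigBuilder` API, and `checkValue` here is a hypothetical helper defined for illustration):

```scala
// Sketch of a checkValue-style guard: accept a value only if the
// validator predicate holds, otherwise fail with the given message.
// This mimics the shape of the check in the diff, not Spark internals.
def checkValue(v: Long, validator: Long => Boolean, errorMsg: String): Long = {
  require(validator(v), errorMsg)
  v
}

// Mirrors the proposed bounds from the patch: strictly greater than 12,
// and no larger than Int.MaxValue.
val validate: Long => Boolean = v => v > 12 && v <= Int.MaxValue

// A typical buffer size such as 4096 bytes passes the check.
val accepted = checkValue(4096L, validate,
  s"The buffer size must be greater than 12 and less than ${Int.MaxValue}.")
```

With these bounds, a value of 12 or below (or one exceeding `Int.MaxValue`) would trigger the `require` failure and surface the configured error message instead of silently misconfiguring the writer.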
---
---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org