Posted to commits@spark.apache.org by we...@apache.org on 2017/07/08 16:28:08 UTC
spark git commit: [SPARK-21343] Refine the document for spark.reducer.maxReqSizeShuffleToMem.
Repository: spark
Updated Branches:
refs/heads/master 9131bdb7e -> 062c336d0
[SPARK-21343] Refine the document for spark.reducer.maxReqSizeShuffleToMem.
## What changes were proposed in this pull request?
In the current code, the reducer can break an old shuffle service when `spark.reducer.maxReqSizeShuffleToMem` is enabled. Let's refine the document.
Author: jinxing <ji...@126.com>
Closes #18566 from jinxing64/SPARK-21343.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/062c336d
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/062c336d
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/062c336d
Branch: refs/heads/master
Commit: 062c336d06a0bd4e740a18d2349e03e311509243
Parents: 9131bdb
Author: jinxing <ji...@126.com>
Authored: Sun Jul 9 00:27:58 2017 +0800
Committer: Wenchen Fan <we...@databricks.com>
Committed: Sun Jul 9 00:27:58 2017 +0800
----------------------------------------------------------------------
.../scala/org/apache/spark/internal/config/package.scala | 6 ++++--
docs/configuration.md | 10 ++++++++++
2 files changed, 14 insertions(+), 2 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/062c336d/core/src/main/scala/org/apache/spark/internal/config/package.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/internal/config/package.scala b/core/src/main/scala/org/apache/spark/internal/config/package.scala
index a629810..512d539 100644
--- a/core/src/main/scala/org/apache/spark/internal/config/package.scala
+++ b/core/src/main/scala/org/apache/spark/internal/config/package.scala
@@ -323,9 +323,11 @@ package object config {
private[spark] val REDUCER_MAX_REQ_SIZE_SHUFFLE_TO_MEM =
ConfigBuilder("spark.reducer.maxReqSizeShuffleToMem")
- .internal()
.doc("The blocks of a shuffle request will be fetched to disk when size of the request is " +
- "above this threshold. This is to avoid a giant request takes too much memory.")
+ "above this threshold. This is to avoid a giant request taking too much memory. We can " +
+ "enable this config by setting a specific value (e.g. 200m). Note that this config can " +
+ "be enabled only when the shuffle service is newer than Spark 2.2, or when the shuffle" +
+ " service is disabled.")
.bytesConf(ByteUnit.BYTE)
.createWithDefault(Long.MaxValue)
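For reference, this threshold is set like any other Spark property, e.g. at submit time (a sketch; the application class and jar below are placeholders):

```
spark-submit \
  --conf spark.reducer.maxReqSizeShuffleToMem=200m \
  --class com.example.MyApp app.jar
```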
http://git-wip-us.apache.org/repos/asf/spark/blob/062c336d/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index 7dc23e4..6ca8424 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -529,6 +529,16 @@ Apart from these, the following properties are also available, and may be useful
</td>
</tr>
<tr>
+ <td><code>spark.reducer.maxReqSizeShuffleToMem</code></td>
+ <td>Long.MaxValue</td>
+ <td>
+ The blocks of a shuffle request will be fetched to disk when the size of the request is above
+ this threshold. This is to avoid a giant request taking too much memory. We can enable this
+ config by setting a specific value (e.g. 200m). Note that this config can be enabled only when
+ the shuffle service is newer than Spark 2.2, or when the shuffle service is disabled.
+ </td>
+</tr>
+<tr>
<td><code>spark.shuffle.compress</code></td>
<td>true</td>
<td>
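The semantics of the threshold can be sketched as a small Python model (a simplified illustration only; in Spark the real decision is made inside the shuffle fetch path, and size strings like "200m" are parsed as mebibytes by Spark's byte-string parser — the function names here are hypothetical):

```python
def parse_size(s):
    # Minimal parser for Spark-style byte strings, e.g. "200m" -> 200 MiB.
    units = {"k": 1024, "m": 1024 ** 2, "g": 1024 ** 3}
    s = s.strip().lower()
    if s and s[-1] in units:
        return int(s[:-1]) * units[s[-1]]
    return int(s)

# Threshold corresponding to spark.reducer.maxReqSizeShuffleToMem=200m.
MAX_REQ_SIZE_SHUFFLE_TO_MEM = parse_size("200m")

def fetch_destination(request_size):
    # Requests larger than the threshold are fetched to disk
    # instead of being buffered in memory.
    if request_size > MAX_REQ_SIZE_SHUFFLE_TO_MEM:
        return "disk"
    return "memory"
```

With the default of Long.MaxValue the threshold is effectively never exceeded, so all requests stay in memory unless the user sets a concrete value.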