Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/05/01 17:42:20 UTC

[GitHub] [spark] tgravescs commented on a change in pull request #23695: [SPARK-26780][CORE] Improve shuffle read using ReadAheadInputStream

tgravescs commented on a change in pull request #23695: [SPARK-26780][CORE] Improve shuffle read using ReadAheadInputStream
URL: https://github.com/apache/spark/pull/23695#discussion_r280150222
 
 

 ##########
 File path: core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala
 ##########
 @@ -474,6 +475,12 @@ final class ShuffleBlockFetcherIterator(
               // TODO: manage the memory used here, and spill it into disk in case of OOM.
               Utils.copyStream(input, out, closeStreams = true)
               input = out.toChunkedByteBuffer.toInputStream(dispose = true)
+            } else {
+              val readAheadEnabled = SparkEnv.get != null &&
+                SparkEnv.get.conf.get(config.SHUFFLE_READ_AHEAD)
+              if (readAheadEnabled) {
+                input = new ReadAheadInputStream(input, 1024 * 1024)
 
 Review comment:
   Yes, I think we should document this behavior. I also agree with some of the other comments that we may not need the enabled config; perhaps keep just the buffer size one, with a value of 0 meaning disabled. How did you choose the buffer size? Did you just take it from the existing spill reader config? Did you try different buffer sizes?
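   
   A minimal sketch of that alternative, assuming Spark's ConfigBuilder DSL; the config key, doc string, and default below are hypothetical, chosen only to illustrate the 0-means-disabled pattern:
   
       import org.apache.spark.internal.config.ConfigBuilder
       import org.apache.spark.network.util.ByteUnit
   
       // Hypothetical single config replacing the boolean toggle:
       // a value of 0 disables read-ahead entirely.
       private[spark] val SHUFFLE_READ_AHEAD_BUFFER_SIZE =
         ConfigBuilder("spark.shuffle.readAheadBufferSize")
           .doc("Buffer size for read-ahead during shuffle reads. " +
             "Set to 0 to disable read-ahead.")
           .bytesConf(ByteUnit.BYTE)
           .createWithDefault(1024L * 1024)
   
       // At the call site, a size check would replace the enabled check:
       val bufferSize = SparkEnv.get.conf.get(config.SHUFFLE_READ_AHEAD_BUFFER_SIZE)
       if (bufferSize > 0) {
         input = new ReadAheadInputStream(input, bufferSize.toInt)
       }
   
   This keeps a single knob for users and makes the disabled state explicit, at the cost of overloading one setting with two meanings.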

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
users@infra.apache.org


With regards,
Apache Git Services

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org