Posted to reviews@spark.apache.org by kiszk <gi...@git.apache.org> on 2018/08/03 13:41:31 UTC

[GitHub] spark pull request #21933: [SPARK-24917][CORE] make chunk size configurable

Github user kiszk commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21933#discussion_r207547545
  
    --- Diff: core/src/main/scala/org/apache/spark/serializer/SerializerManager.scala ---
    @@ -70,6 +70,8 @@ private[spark] class SerializerManager(
       private[this] val compressRdds = conf.getBoolean("spark.rdd.compress", false)
       // Whether to compress shuffle output temporarily spilled to disk
       private[this] val compressShuffleSpill = conf.getBoolean("spark.shuffle.spill.compress", true)
    +  // Size of the chunks to be used in the ChunkedByteBuffer
    +  private[this] val chunkSizeMb = conf.getSizeAsMb("spark.memory.chunkSize", "4m").toInt
    --- End diff --
    
    The name `spark.memory.chunkSize` looks too generic.
    How about `spark.memory.serializer.chunkSize`, or something similarly scoped?
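
    For context, a minimal sketch of how such a setting is read and could be
    applied, using the reviewer's suggested key name
    `spark.memory.serializer.chunkSize` (hypothetical; the final name was still
    under discussion at this point). `SparkConf.getSizeAsMb` parses suffixed
    size strings such as "4m" and returns the value in MiB. This is only an
    illustration, not the actual SerializerManager/ChunkedByteBuffer code:

        import java.nio.ByteBuffer
        import org.apache.spark.SparkConf

        object ChunkSizeSketch {
          def main(args: Array[String]): Unit = {
            val conf = new SparkConf(false)
            // getSizeAsMb understands suffixed values ("4m", "1g") and returns
            // the size in MiB; "4m" matches the default proposed in the PR.
            val chunkSizeMb =
              conf.getSizeAsMb("spark.memory.serializer.chunkSize", "4m").toInt
            val chunkSizeBytes = chunkSizeMb * 1024 * 1024

            // Carve a payload into fixed-size chunks, roughly as a
            // ChunkedByteBuffer does; the last chunk may be partially used.
            val totalBytes = 10L * 1024 * 1024
            val numChunks =
              ((totalBytes + chunkSizeBytes - 1) / chunkSizeBytes).toInt
            val chunks = Array.fill(numChunks)(ByteBuffer.allocate(chunkSizeBytes))
            println(s"chunkSizeMb=$chunkSizeMb numChunks=${chunks.length}")
          }
        }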


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org