Posted to reviews@spark.apache.org by rdblue <gi...@git.apache.org> on 2018/08/24 18:21:09 UTC

[GitHub] spark pull request #21977: [SPARK-25004][CORE] Add spark.executor.pyspark.me...

Github user rdblue commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21977#discussion_r212714476
  
    --- Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala ---
    @@ -114,6 +114,10 @@ package object config {
         .checkValue(_ >= 0, "The off-heap memory size must not be negative")
         .createWithDefault(0)
     
    +  private[spark] val PYSPARK_EXECUTOR_MEMORY = ConfigBuilder("spark.executor.pyspark.memory")
    --- End diff --
    
    Yes, it should. I'll fix it.
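
    For context, the diff above shows only the first line of the new config entry. A hedged sketch of how a byte-size entry like this is typically completed in Spark's `config` package is below; the `.bytesConf(ByteUnit.MiB)` and `.createOptional` calls are assumptions modeled on neighboring entries in the same file, not necessarily the final merged code for this PR:

    ```scala
    // Sketch only -- depends on Spark-internal APIs
    // (org.apache.spark.internal.config.ConfigBuilder,
    //  org.apache.spark.network.util.ByteUnit) and is not standalone.
    private[spark] val PYSPARK_EXECUTOR_MEMORY =
      ConfigBuilder("spark.executor.pyspark.memory")
        // Parse values like "512m" or "2g"; resolve to MiB, matching
        // how other per-executor memory settings are declared.
        .bytesConf(ByteUnit.MiB)
        // No default: when unset, PySpark workers keep the existing
        // (unlimited) behavior.
        .createOptional
    ```

    Declaring the config through `ConfigBuilder` (rather than reading the raw string elsewhere) gives consistent unit parsing and a single place to attach validation such as `.checkValue(...)`.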


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscribe@spark.apache.org
For additional commands, e-mail: reviews-help@spark.apache.org