Posted to issues@spark.apache.org by "ChenTao (JIRA)" <ji...@apache.org> on 2019/06/12 08:58:00 UTC

[jira] [Created] (SPARK-28021) An inappropriate exception in StaticMemoryManager.getMaxExecutionMemory

ChenTao created SPARK-28021:
-------------------------------

             Summary: An inappropriate exception in StaticMemoryManager.getMaxExecutionMemory
                 Key: SPARK-28021
                 URL: https://issues.apache.org/jira/browse/SPARK-28021
             Project: Spark
          Issue Type: Question
          Components: Spark Core
    Affects Versions: 2.4.3
            Reporter: ChenTao


While reviewing StaticMemoryManager.scala, a question came to mind.
{code:scala}
private def getMaxExecutionMemory(conf: SparkConf): Long = {
  val systemMaxMemory = conf.getLong("spark.testing.memory", Runtime.getRuntime.maxMemory)

  if (systemMaxMemory < MIN_MEMORY_BYTES) {
    throw new IllegalArgumentException(s"System memory $systemMaxMemory must " +
      s"be at least $MIN_MEMORY_BYTES. Please increase heap size using the --driver-memory " +
      s"option or spark.driver.memory in Spark configuration.")
  }
  if (conf.contains("spark.executor.memory")) {
    val executorMemory = conf.getSizeAsBytes("spark.executor.memory")
    if (executorMemory < MIN_MEMORY_BYTES) {
      throw new IllegalArgumentException(s"Executor memory $executorMemory must be at least " +
        s"$MIN_MEMORY_BYTES. Please increase executor memory using the " +
        s"--executor-memory option or spark.executor.memory in Spark configuration.")
    }
  }
  val memoryFraction = conf.getDouble("spark.shuffle.memoryFraction", 0.2)
  val safetyFraction = conf.getDouble("spark.shuffle.safetyFraction", 0.8)
  (systemMaxMemory * memoryFraction * safetyFraction).toLong
}
{code}
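
For context, with the default fractions the execution region ends up being only 16% of the heap (0.2 * 0.8). A minimal sketch of that last line's arithmetic, using an assumed 1 GiB heap rather than a real Runtime.getRuntime.maxMemory value:
{code:scala}
// Illustrative only: reproduces the final line of getMaxExecutionMemory
// with the default fractions and a hypothetical 1 GiB heap.
val systemMaxMemory = 1024L * 1024 * 1024  // assumed heap, stands in for Runtime.getRuntime.maxMemory
val memoryFraction = 0.2                   // default spark.shuffle.memoryFraction
val safetyFraction = 0.8                   // default spark.shuffle.safetyFraction

val maxExecutionMemory = (systemMaxMemory * memoryFraction * safetyFraction).toLong
// = 1 GiB * 0.16 = 171798691 bytes (~164 MiB)
{code}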
When an executor calls getMaxExecutionMemory, it first sets systemMaxMemory from Runtime.getRuntime.maxMemory (unless spark.testing.memory is set), and then compares systemMaxMemory against MIN_MEMORY_BYTES.

If systemMaxMemory is below that minimum, the program throws an exception telling the user to increase the heap size with the --driver-memory option.

I wonder if that message is wrong, because the heap size of an executor is set by --executor-memory, not --driver-memory?

Although there is a second exception below that does tell the user to adjust executor memory, I think the first exception's message may not be appropriate.
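
To make concrete what I mean, here is a sketch of how the message could mention the executor-side option as well. This is only an illustration of my suggestion, not actual Spark code; detecting whether the JVM is an executor is left out, since the current code makes no such distinction:
{code:scala}
// Hypothetical rewording (sketch only): keep the same check, but point the
// user at both options, since on an executor the heap comes from --executor-memory.
if (systemMaxMemory < MIN_MEMORY_BYTES) {
  throw new IllegalArgumentException(s"System memory $systemMaxMemory must " +
    s"be at least $MIN_MEMORY_BYTES. Please increase heap size using the " +
    s"--driver-memory option (or --executor-memory / spark.executor.memory " +
    s"if this JVM is an executor).")
}
{code}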

Thanks for answering my question!


