Posted to issues@spark.apache.org by "Oskar Rynkiewicz (JIRA)" <ji...@apache.org> on 2019/02/12 19:01:00 UTC

[jira] [Created] (SPARK-26863) Add minimal values for spark.driver.memory and spark.executor.memory

Oskar Rynkiewicz created SPARK-26863:
----------------------------------------

             Summary: Add minimal values for spark.driver.memory and spark.executor.memory
                 Key: SPARK-26863
                 URL: https://issues.apache.org/jira/browse/SPARK-26863
             Project: Spark
          Issue Type: Documentation
          Components: Documentation
    Affects Versions: 2.4.0
            Reporter: Oskar Rynkiewicz


I propose to change `1g` to `1g, with a minimum of 472m` in the "Default" column for the spark.driver.memory and spark.executor.memory properties in [Application Properties](https://spark.apache.org/docs/latest/configuration.html#application-properties).

Reasoning:

In the UnifiedMemoryManager.scala file I see the definition of `RESERVED_SYSTEM_MEMORY_BYTES`:

```
// Set aside a fixed amount of memory for non-storage, non-execution purposes.
// This serves a function similar to `spark.memory.fraction`, but guarantees that we reserve
// sufficient memory for the system even for small heaps. E.g. if we have a 1GB JVM, then
// the memory used for execution and storage will be (1024 - 300) * 0.6 = 434MB by default.
private val RESERVED_SYSTEM_MEMORY_BYTES = 300 * 1024 * 1024
```
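
For context, here is the arithmetic from the comment above as a small standalone sketch (not Spark code, just the numbers), assuming a 1GiB heap and the default spark.memory.fraction of 0.6:

```
// Not Spark code: just reproducing "(1024 - 300) * 0.6 = 434MB" from the comment above.
val systemMemory   = 1024L * 1024 * 1024          // assumed 1GiB JVM heap
val reserved       = 300L * 1024 * 1024           // RESERVED_SYSTEM_MEMORY_BYTES
val memoryFraction = 0.6                          // default spark.memory.fraction
val usable = ((systemMemory - reserved) * memoryFraction).toLong
println(usable / (1024 * 1024))                   // ~434 MiB available for execution + storage
```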

Then `reservedMemory` takes on this value, and `minSystemMemory` is defined as:

```
val minSystemMemory = (reservedMemory * 1.5).ceil.toLong
```

Consequently, the driver heap size and the executor memory are checked to be at least minSystemMemory (471859200 bytes); otherwise an IllegalArgumentException is thrown.
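
For reference, the driver-side check in UnifiedMemoryManager looks roughly like the following (a paraphrased, self-contained sketch rather than a verbatim quote; the executor-side check is analogous):

```
// Paraphrased sketch of the driver-side validation (not a verbatim quote from Spark).
val reservedMemory  = 300L * 1024 * 1024                  // RESERVED_SYSTEM_MEMORY_BYTES
val minSystemMemory = (reservedMemory * 1.5).ceil.toLong  // 471859200 bytes
val systemMemory    = Runtime.getRuntime.maxMemory        // roughly the configured driver heap
if (systemMemory < minSystemMemory) {
  throw new IllegalArgumentException(
    s"System memory $systemMemory must be at least $minSystemMemory. " +
    "Please increase heap size using the --driver-memory option or " +
    "spark.driver.memory in Spark configuration.")
}
```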

It seems that 472MB is the absolute minimum for `spark.driver.memory` (`--driver-memory`) and `spark.executor.memory` (`--executor-memory`).
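
A quick sketch of the arithmetic behind that figure (471859200 bytes is exactly 450MiB, or roughly 472 decimal MB):

```
// Not Spark code: just where the ~472MB number comes from.
val reservedMemory  = 300L * 1024 * 1024                  // RESERVED_SYSTEM_MEMORY_BYTES
val minSystemMemory = (reservedMemory * 1.5).ceil.toLong  // 471859200 bytes
println(minSystemMemory / (1024.0 * 1024))                // 450.0 MiB
println(minSystemMemory / (1000.0 * 1000))                // ~471.86 decimal MB, i.e. ~472MB
```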

Side question: how was this 472MB established as sufficient memory for small heaps? What do I risk if I build Spark with a smaller RESERVED_SYSTEM_MEMORY_BYTES?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org