Posted to user@spark.apache.org by Lan Jiang <lj...@gmail.com> on 2015/10/06 21:15:14 UTC

Spark cache memory storage

Hi there,

My understanding is that the cache storage is calculated as follows:

executor heap size * spark.storage.safetyFraction *
spark.storage.memoryFraction.

The default value of safetyFraction is 0.9 and of memoryFraction is 0.6. When
I started a Spark job on YARN, I set executor-memory to 6g, so I expected the
cache memory to be 6 * 0.9 * 0.6 = 3.24g. However, the Spark history server
shows the reserved cache size for each executor as 3.1g, so the numbers do
not add up. What am I missing?
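
In case it helps, here is a minimal, self-contained Scala sketch of the
calculation as I understand it. Two caveats: the object and method names are
just illustrative, and I am assuming the base value being multiplied is what
the JVM reports via Runtime.getRuntime.maxMemory rather than the raw
--executor-memory setting (maxMemory is usually somewhat less than -Xmx).

// Sketch of the static storage-memory calculation as I understand it.
// Assumption: the base is Runtime.getRuntime.maxMemory, not the raw
// --executor-memory value.
object StorageMemoryEstimate {
  def maxStorageMemory(memoryFraction: Double = 0.6,  // spark.storage.memoryFraction
                       safetyFraction: Double = 0.9   // spark.storage.safetyFraction
                      ): Long = {
    val systemMaxMemory = Runtime.getRuntime.maxMemory // JVM-reported max heap
    (systemMaxMemory * memoryFraction * safetyFraction).toLong
  }

  def main(args: Array[String]): Unit = {
    val gib = maxStorageMemory() / math.pow(1024, 3)
    println(f"estimated storage memory: $gib%.2f GiB")
  }
}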

Lan