Posted to user@spark.apache.org by Hlib Mykhailenko <hl...@inria.fr> on 2014/11/26 14:55:09 UTC

Force Spark to use all available memory on each node

Hello, 

Spark has the 'spark.executor.memory' property, which defines the amount of memory used on each computational node. 
By default it is equal to 512 MB. Is there a way to tell Spark to use 'all available memory minus 1 GB'? 
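
To make the question concrete, here is a rough sketch of what I mean. It assumes the executors have the same amount of physical RAM as the machine that builds the SparkConf; only the property name spark.executor.memory comes from Spark itself, the rest is just illustration:

    import java.lang.management.ManagementFactory
    import com.sun.management.OperatingSystemMXBean
    import org.apache.spark.{SparkConf, SparkContext}

    object AllButOneGb {
      def main(args: Array[String]): Unit = {
        // Total physical RAM of this machine, in megabytes
        val totalMb = ManagementFactory.getOperatingSystemMXBean
          .asInstanceOf[OperatingSystemMXBean]
          .getTotalPhysicalMemorySize / (1024L * 1024L)

        // What I would like each executor to get: everything minus 1 GB for the OS
        val executorMb = totalMb - 1024L

        val conf = new SparkConf()
          .setAppName("all-but-one-gb")
          .set("spark.executor.memory", s"${executorMb}m")

        val sc = new SparkContext(conf)
        // ... job code here ...
        sc.stop()
      }
    }

The obvious weakness is that totalMb is measured on the machine running this code, not on each worker, which is exactly why I am asking whether Spark itself offers a setting like 'all available memory minus 1 GB'.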

Thank you in advance. 
-- 
Best regards, 
Hlib Mykhailenko 
PhD student at INRIA Sophia-Antipolis Méditerranée 
2004 Route des Lucioles BP93 
06902 SOPHIA ANTIPOLIS cedex