Posted to user@spark.apache.org by Sai Prasanna <an...@gmail.com> on 2014/03/26 06:36:38 UTC

Spark executor memory & relationship with worker threads

Hi All,
Does the number of worker threads bear any relationship to the executor
memory setting?
I have 16 GB of RAM and an 8-core processor. I had set SPARK_MEM to 12g
and was running locally with the default single thread.
So this means at most one executor can be scheduled on a node at any
point in time. If I increase the number of worker threads to, say, 4, do
I need to reduce SPARK_MEM to 3g, or not?
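For what it's worth, the two settings in question could be sketched like this (a hypothetical illustration assuming the Spark 0.9-era configuration named above; in local mode the worker "threads" all live inside a single JVM, so the memory setting sizes that one process rather than each thread):

```shell
# Sketch only, assuming the legacy SPARK_MEM variable from the post
# (later replaced by spark.executor.memory). In local mode, all worker
# threads share one JVM heap, so this sizes the whole process.
export SPARK_MEM=12g

# Run the shell locally with 4 worker threads sharing that single heap.
MASTER=local[4] ./bin/spark-shell
```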

Is there any performance difference between running in the interactive
Spark shell and running a non-interactive standalone Spark application,
apart from the build time (sbt...)?

THANKS !!