Posted to user@spark.apache.org by Yadid Ayzenberg <ya...@media.mit.edu> on 2014/02/10 15:48:08 UTC

problem running multiple executors on large machine

Hi All,

I have a setup that consists of 8 small machines (1 core, 8G RAM each)
and 1 large machine (8 cores, 100G RAM). Is there a way to have
Spark run multiple executors on the large machine, and a single
executor on each of the small machines?
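
For example, on the large machine I was hoping for something along
these lines in conf/spark-env.sh (just a sketch, assuming the
standalone deploy mode; SPARK_WORKER_INSTANCES / SPARK_WORKER_CORES /
SPARK_WORKER_MEMORY are the standalone settings I have been looking
at, and the numbers are only illustrative):

    # conf/spark-env.sh on the large machine (standalone mode)
    # run 4 worker instances, each with 2 cores and 24g of memory
    export SPARK_WORKER_INSTANCES=4
    export SPARK_WORKER_CORES=2
    export SPARK_WORKER_MEMORY=24g

while each small machine would keep a single worker with 1 core and
most of its 8G.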

Alternatively, is it possible to run a single executor that utilizes
all cores and available memory on the large machine, alongside
executors with less memory on the smaller machines?

I tried configuring spark-env.sh on the large machine, but the Java
-Xmx setting ends up being applied uniformly across the entire cluster.
Is there any way to configure -Xmx separately for each machine?
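
To be concrete, what I was attempting was to give each machine its own
conf/spark-env.sh (my understanding is that each standalone worker
sources its local copy, though I may be wrong about that), along these
lines:

    # conf/spark-env.sh on each small machine
    export SPARK_WORKER_CORES=1
    export SPARK_WORKER_MEMORY=6g

    # conf/spark-env.sh on the large machine
    export SPARK_WORKER_CORES=8
    export SPARK_WORKER_MEMORY=90g

(The memory values here are just illustrative.) Even with this, the
executor JVMs all come up with the same -Xmx.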


Thanks,

Yadid