Posted to user@spark.apache.org by Divya Gehlot <di...@gmail.com> on 2016/09/14 06:07:36 UTC
how to specify cores and executor to run spark jobs simultaneously
Hi,
I am on an EMR cluster, and my cluster configuration is as below:
Number of nodes (including master node): 3
Memory: 22.50 GB
VCores total: 16
Active nodes: 2
Spark version: 1.6.1
Parameters set in spark-defaults.conf:
spark.executor.instances 2
spark.executor.cores 8
spark.driver.memory 10473M
spark.executor.memory 9658M
spark.default.parallelism 32
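(One thing worth keeping in mind: flags passed on the spark-submit command line override the values in spark-defaults.conf, so these file defaults only apply when a flag is not given explicitly. A hypothetical spark-defaults.conf sized to leave headroom for a second job might look like this; the numbers are illustrative, not tuned:)

```
spark.executor.instances   2
spark.executor.cores       2
spark.executor.memory      2g
spark.driver.memory        2g
spark.default.parallelism  8
```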
Do let me know if you need any other info regarding the cluster.
The current configuration for spark-submit is:
--driver-memory 5G \
--executor-memory 2G \
--executor-cores 5 \
--num-executors 10 \
Currently, with the above job configuration, if I try to run another Spark
job it stays in the ACCEPTED state until the first one finishes.
How do I optimize or update the above spark-submit configuration to run
more Spark jobs simultaneously?
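(A rough back-of-the-envelope check shows why the second job waits. This is a sketch; in Spark 1.6 the per-executor YARN overhead is max(384 MB, 10% of spark.executor.memory), which is 384 MB here. The first submission alone asks for more memory and cores than the whole cluster has, so YARN grants it everything and queues anything else:)

```shell
# Rough YARN demand of one job under the spark-submit flags above.
EXEC_MEM_MB=2048      # --executor-memory 2G
OVERHEAD_MB=384       # max(384, 2048 / 10) = 384 in Spark 1.6
NUM_EXECUTORS=10      # --num-executors 10
EXEC_CORES=5          # --executor-cores 5
PER_JOB_MB=$(( NUM_EXECUTORS * (EXEC_MEM_MB + OVERHEAD_MB) ))
PER_JOB_CORES=$(( NUM_EXECUTORS * EXEC_CORES ))
echo "asked: ${PER_JOB_MB} MB, ${PER_JOB_CORES} cores"
echo "cluster: ~23040 MB, 16 vcores"
```

(Shrinking the first job, e.g. --num-executors 2 --executor-cores 2, would leave capacity for a second, though the scheduler's queue setup also matters, as discussed below in the thread.)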
Would really appreciate the help.
Thanks,
Divya
Re: how to specify cores and executor to run spark jobs simultaneously
Posted by Deepak Sharma <de...@gmail.com>.
I am not sure about EMR, but it seems multi-tenancy is not enabled in your
case.
With multi-tenancy, the applications have to be submitted to different
queues.
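(To illustrate the queue idea: on YARN, the Fair Scheduler can be configured with per-queue resource caps so two applications share the cluster instead of one taking it all. A hypothetical allocations file, fair-scheduler.xml; the queue names and sizes are made up for illustration:)

```xml
<?xml version="1.0"?>
<!-- Hypothetical fair-scheduler.xml: two queues, each capped at
     roughly half of a 22.5 GB / 16-vcore cluster, so one job per
     queue can run at the same time. -->
<allocations>
  <queue name="adhoc1">
    <maxResources>11264 mb, 8 vcores</maxResources>
  </queue>
  <queue name="adhoc2">
    <maxResources>11264 mb, 8 vcores</maxResources>
  </queue>
</allocations>
```

(Each job would then be submitted with --queue adhoc1 or --queue adhoc2 on the spark-submit command line.)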
Thanks
Deepak
On Wed, Sep 14, 2016 at 11:37 AM, Divya Gehlot <di...@gmail.com>
wrote:
> [quoted text snipped]
--
Thanks
Deepak
www.bigdatabig.com
www.keosha.net