Posted to user@spark.apache.org by varuni gang <va...@gmail.com> on 2016/01/27 16:00:25 UTC

help with enabling spark dynamic allocation

Hi,

As per the Spark documentation for Dynamic Resource Allocation, I did
the following to enable the shuffle service and dynamic allocation:

A) Added the following lines to "spark-defaults.conf" (the same settings
can also be passed on the command line; see the spark-submit example after step C)

# enabling dynamic resource allocation and shuffle service
spark.dynamicAllocation.enabled=true
spark.dynamicAllocation.initialExecutors=10
spark.dynamicAllocation.minExecutors=5
spark.shuffle.service.enabled=true

B) Added the following to "yarn-site.xml"
<property>
  <name>yarn.nodemanager.aux-services</name>
  <value>spark_shuffle</value>
</property>
<property>
  <name>yarn.nodemanager.aux-services.spark_shuffle.class</name>
  <value>org.apache.spark.network.yarn.YarnShuffleService</value>
</property>

C) Removed "--num-executors" from the arguments when submitting the Spark job; an example of the resulting spark-submit invocation is sketched below.
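
For reference, the submit command now looks roughly like this (the class name,
jar, and master value are placeholders rather than my real job; the --conf lines
simply mirror what is already in spark-defaults.conf):

# placeholders: application class, jar, and master are examples only
spark-submit \
  --master yarn-cluster \
  --conf spark.dynamicAllocation.enabled=true \
  --conf spark.dynamicAllocation.initialExecutors=10 \
  --conf spark.dynamicAllocation.minExecutors=5 \
  --conf spark.shuffle.service.enabled=true \
  --class com.example.MyApp \
  my-app.jar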

The Spark version I am using to submit jobs is 1.4, while the Spark YARN
shuffle service on the cluster is at version 1.3.
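
For context, the shuffle service jar backing the spark_shuffle aux-service from
step B was deployed to the NodeManagers roughly as below (the jar name, paths,
and service command are examples from my setup and will differ per distribution):

# copy the Spark YARN shuffle jar onto each NodeManager's classpath
# (jar name and target directory are examples; adjust for your Spark build and Hadoop layout)
cp $SPARK_HOME/lib/spark-1.3.1-yarn-shuffle.jar /usr/lib/hadoop-yarn/lib/
# restart the NodeManager so the spark_shuffle aux-service is picked up
sudo service hadoop-yarn-nodemanager restart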

After running the job I found that it does not start with the configured
number of initial executors and uses only 3 executors, even though the
minimum I've set in the defaults is 5. Does that mean dynamic allocation is
not working correctly? What must I do to configure it correctly?
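
In case it helps with diagnosing, these are the rough checks I am aware of
(log paths, the application id, and the exact log messages are placeholders
and vary by setup):

# did the spark_shuffle aux-service load on a NodeManager? (log path is an example)
grep -i "YarnShuffleService" /var/log/hadoop-yarn/*nodemanager*.log
# is the driver requesting and releasing executors? (application id is a placeholder;
# requires YARN log aggregation for finished applications)
yarn logs -applicationId application_1453900000000_0001 | grep -i "ExecutorAllocationManager"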

Thanks!
Varuni

Re: help with enabling spark dynamic allocation

Posted by Saisai Shao <sa...@gmail.com>.
You should also check the available YARN resources; overall, the number of
containers that can be allocated is limited by the YARN cluster's resources.
My guess is that your YARN cluster can only allocate 3 containers here, so
even if you set the initial number to 10, it cannot be satisfied.
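
For example, something like the following shows how much room the cluster
actually has (this assumes the Hadoop yarn CLI is on the path; the
ResourceManager web UI, usually on port 8088, reports the same totals):

# list NodeManagers and how many containers each is currently running
yarn node -list -all
# show memory/vcore capacity and usage for one node (take a Node-Id from the list above)
yarn node -status <Node-Id>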

