Posted to user@spark.apache.org by Shams ul Haque <sh...@cashcare.in> on 2016/05/04 10:03:36 UTC

restrict my spark app to run on specific machines

Hi,

I have a cluster of 4 machines for Spark, and I want my Spark app to run on
only 2 of them, leaving the other 2 machines free for other Spark apps.
Can I restrict my app to those 2 machines, for example by passing their IPs
when setting up SparkConf, or through some other setting?


Thanks,
Shams

Re: restrict my spark app to run on specific machines

Posted by Ted Yu <yu...@gmail.com>.
Please refer to:
https://spark.apache.org/docs/latest/running-on-yarn.html

You can set spark.yarn.am.nodeLabelExpression and
spark.yarn.executor.nodeLabelExpression to a YARN node label that maps to those 2 machines.
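
A minimal sketch of what that could look like, assuming your cluster admin has
already created a YARN node label (the name "spark_dedicated" below is only an
example) and assigned it to the two target machines; node labels also require
Hadoop 2.6+ and a scheduler configured to honor them:

  import org.apache.spark.{SparkConf, SparkContext}

  // Restrict both the YARN application master and the executors to nodes
  // carrying the "spark_dedicated" label. The label name is an assumption;
  // use whatever label your admin assigned to the two machines.
  val conf = new SparkConf()
    .setAppName("restricted-app")
    .set("spark.yarn.am.nodeLabelExpression", "spark_dedicated")
    .set("spark.yarn.executor.nodeLabelExpression", "spark_dedicated")

  val sc = new SparkContext(conf)

The same properties can also be passed on the command line alongside your
usual class and jar arguments, e.g.:

  spark-submit \
    --conf spark.yarn.am.nodeLabelExpression=spark_dedicated \
    --conf spark.yarn.executor.nodeLabelExpression=spark_dedicated \
    ...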
