Posted to user@spark.apache.org by Latha Appanna <la...@gmail.com> on 2019/07/26 04:43:49 UTC

[spark standalone mode] force spark to launch driver in a specific worker in cluster mode

Hello,

I'm looking for a way to configure the spark-master to launch the *driver* on a
specific spark-worker when using the *cluster* deploy mode. Say I have master1,
worker1, and worker2. I want the spark-master to always launch the driver on
worker2 when the deploy mode is cluster and Spark is running in standalone mode.
Please let me know what Spark configurations need to be set to achieve this.
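For reference, this is roughly the kind of submission I mean (the master URL,
application class, and jar path below are placeholders):

    spark-submit \
      --master spark://master1:7077 \
      --deploy-mode cluster \
      --class com.example.MyApp \
      /path/to/app.jar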


Thanks & Regards,
Latha

Re: [spark standalone mode] force spark to launch driver in a specific worker in cluster mode

Posted by Shamshad Ansari <sa...@accureanalytics.com>.
spark.driver.host (default: local hostname): hostname or IP address for the
driver. This is used for communicating with the executors and the standalone
Master.
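
For illustration, a sketch of how that property could be passed at submit time
(the hostnames, class name, and jar path are placeholders; per the description
above, this sets the address the driver uses for communication rather than
choosing which worker runs it):

    spark-submit \
      --master spark://master1:7077 \
      --deploy-mode cluster \
      --conf spark.driver.host=worker2 \
      --class com.example.MyApp \
      /path/to/app.jar

It can equivalently be set once in conf/spark-defaults.conf as a
"spark.driver.host  worker2" line instead of on every submission.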

On Fri, Jul 26, 2019 at 12:43 AM Latha Appanna <la...@gmail.com>
wrote:

> Hello,
>
> I'm looking for a way to configure the spark-master to launch the *driver* on a
> specific spark-worker when using the *cluster* deploy mode. Say I have master1,
> worker1, and worker2. I want the spark-master to always launch the driver on
> worker2 when the deploy mode is cluster and Spark is running in standalone mode.
> Please let me know what Spark configurations need to be set to achieve this.
>
>
> Thanks & Regards,
> Latha
>
>