Posted to user@spark.apache.org by Theodore Si <sj...@gmail.com> on 2014/10/24 09:04:06 UTC

How can I set the IP a worker uses?

Hi all,

I have two network interface cards on one node: one is an Ethernet card,
the other an Infiniband HCA.
The master has two IP addresses, let's say 1.2.3.4 (for the Ethernet card)
and 2.3.4.5 (for the HCA).
I can start the master with
export SPARK_MASTER_IP='1.2.3.4'; sbin/start-master.sh
so that the master listens on 1.2.3.4:7077

But when I connect the worker to the master with
spark-1.0.1/bin/spark-class org.apache.spark.deploy.worker.Worker spark://1.2.3.4:7077

I get errors, since the worker is using its HCA card. How can I make the
worker use its Ethernet card?

Thanks
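
[Editor's note: on Linux, assuming the iproute2 tools are installed, a quick way to see which interface and source address the kernel would pick for traffic toward the master is:

```shell
# Ask the kernel which route it would take to the master; the "src" field
# in the output is the local address the worker ends up using by default.
ip route get 1.2.3.4
```

If this names the Infiniband interface, that would explain why the worker advertises the HCA address unless it is explicitly told to bind elsewhere.]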

Re: How can I set the IP a worker uses?

Posted by Theodore Si <sj...@gmail.com>.
I found this in the worker's usage output. So it seems that we should use -h
or --host instead of -i and --ip:

  -i HOST, --ip IP         Hostname to listen on (deprecated, please use --host or -h)
  -h HOST, --host HOST     Hostname to listen on
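
[Editor's note: putting that together, the worker would presumably be launched with the non-deprecated flag along these lines. The thread never gives the worker's own Ethernet address, so 1.2.3.6 below is a made-up placeholder for it:

```shell
# On the worker node: bind the worker to its own Ethernet address
# (1.2.3.6 is hypothetical) and register with the master at 1.2.3.4.
spark-1.0.1/bin/spark-class org.apache.spark.deploy.worker.Worker \
    --host 1.2.3.6 spark://1.2.3.4:7077
```

Note that the address passed to --host should be the worker's address, not the master's.]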


On 10/24/2014 3:35 PM, Akhil Das wrote:
> Try using the --ip parameter
> <http://spark.apache.org/docs/latest/spark-standalone.html#starting-a-cluster-manually>
> while starting the worker, like:
>
> spark-1.0.1/bin/spark-class org.apache.spark.deploy.worker.Worker --ip 1.2.3.4 spark://1.2.3.4:7077
>
> Thanks
> Best Regards


Re: How can I set the IP a worker uses?

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Try using the --ip parameter
<http://spark.apache.org/docs/latest/spark-standalone.html#starting-a-cluster-manually>
while starting the worker, like:

spark-1.0.1/bin/spark-class org.apache.spark.deploy.worker.Worker --ip 1.2.3.4 spark://1.2.3.4:7077

Thanks
Best Regards
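
[Editor's note: an alternative sketch, based on the spark-env.sh settings described in the standalone-mode docs: SPARK_LOCAL_IP binds the Spark daemons on a node to a particular address, so something like the following should have a similar effect. As above, 1.2.3.6 is a hypothetical stand-in for the worker's Ethernet address, which the thread does not give:

```shell
# Hypothetical worker-side launch: bind Spark daemons on this node to the
# Ethernet address via SPARK_LOCAL_IP, then start the worker as before.
export SPARK_LOCAL_IP='1.2.3.6'
spark-1.0.1/bin/spark-class org.apache.spark.deploy.worker.Worker spark://1.2.3.4:7077
```

This avoids the per-daemon --host flag when several Spark processes run on the same node.]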

On Fri, Oct 24, 2014 at 12:34 PM, Theodore Si <sj...@gmail.com> wrote:

>  Hi all,
>
> I have two network interface cards on one node: one is an Ethernet card,
> the other an Infiniband HCA.
> The master has two IP addresses, let's say 1.2.3.4 (for the Ethernet card)
> and 2.3.4.5 (for the HCA).
> I can start the master with
> export SPARK_MASTER_IP='1.2.3.4'; sbin/start-master.sh
> so that the master listens on 1.2.3.4:7077
>
> But when I connect the worker to the master with
> spark-1.0.1/bin/spark-class org.apache.spark.deploy.worker.Worker spark://1.2.3.4:7077
>
> I get errors, since the worker is using its HCA card. How can I make the
> worker use its Ethernet card?
>
> Thanks
>