Posted to user@spark.apache.org by anny9699 <an...@gmail.com> on 2015/03/31 05:48:48 UTC

How to configure SparkUI to use internal ec2 ip

Hi,

For security reasons, we added a server between my AWS Spark cluster and my
local machine, so I can't connect to the cluster directly. To see the Spark UI
and the workers' stdout and stderr, I used dynamic forwarding and configured a
SOCKS proxy. Now I can see the Spark UI using the internal EC2 IP; however,
when I click through to the application UI (port 4040) or a worker's UI (port
8081), the links still automatically use the public DNS instead of the internal
EC2 IP, which the browser can't reach.
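For reference, here is roughly how I set up the forwarding; the user name and
bastion hostname below are placeholders for our actual setup:

    # Open a SOCKS proxy on local port 1080, tunnelled through the
    # intermediate server (bastion.example.com is a placeholder):
    ssh -N -D 1080 user@bastion.example.com

    # Then point the browser's SOCKS v5 proxy at localhost:1080 and open
    # http://<master-internal-ip>:8080 to reach the Spark master UI.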

Is there a way to configure this? I saw that one could set LOCAL_ADDRESS_IP in
spark-env.sh, but I'm not sure whether that would help. Has anyone run into the
same issue?

Thanks a lot!
Anny






Re: How to configure SparkUI to use internal ec2 ip

Posted by Anny Chen <an...@gmail.com>.
Thanks Petar and Akhil for the suggestions.

I changed SPARK_MASTER_IP to the internal IP, deleted the
"export SPARK_PUBLIC_DNS=xxxxxx" line in spark-env.sh, and also edited
/etc/hosts as Akhil suggested, and now it is working! However, I don't
know which change actually made it work.
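In case it helps others, here is roughly what spark-env.sh on the master looks
like now (the internal IP is a placeholder for our actual address):

    # spark-env.sh on the master
    export SPARK_MASTER_IP=10.0.0.5    # placeholder internal EC2 IP
    # Removed, so the UI stops advertising the public DNS:
    # export SPARK_PUBLIC_DNS=xxxxxx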

Thanks!
Anny


Re: How to configure SparkUI to use internal ec2 ip

Posted by Petar Zecevic <pe...@gmail.com>.
Did you try setting the SPARK_MASTER_IP parameter in spark-env.sh?
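Something along these lines, with a placeholder IP:

    # spark-env.sh: bind the master to its internal address
    export SPARK_MASTER_IP=10.0.0.5    # placeholder; your master's internal IP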




Re: How to configure SparkUI to use internal ec2 ip

Posted by Anny Chen <an...@gmail.com>.
Hi Akhil,

Thanks for the explanation! I can ping the worker from the master using
either the hostname or the internal IP, but I'm a little confused about why
setting SPARK_LOCAL_IP would help.

Thanks!
Anny


Re: How to configure SparkUI to use internal ec2 ip

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
When you say you added <internal-ip> <hostname>, were you able to ping
either of those from the machine?

You could try setting SPARK_LOCAL_IP on all machines. But make sure the
process will actually be able to bind to the host/IP specified there.
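For example, something like this in spark-env.sh on each node (the address is
a placeholder; each machine should use its own internal IP):

    # spark-env.sh on each node: bind Spark's services to the internal address
    export SPARK_LOCAL_IP=10.0.0.6     # placeholder; this node's internal EC2 IP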


Thanks
Best Regards


Re: How to configure SparkUI to use internal ec2 ip

Posted by Anny Chen <an...@gmail.com>.
Hi Akhil,

I tried editing /etc/hosts on the master and on the workers, and it seems
it is not working for me.

I tried adding <hostname> <internal-ip> and it didn't work. I then tried
adding <internal-ip> <hostname> and it didn't work either. I guess I should
also edit the spark-env.sh file?

Thanks!
Anny


Re: How to configure SparkUI to use internal ec2 ip

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
You can add an internal-IP-to-public-hostname mapping in your /etc/hosts
file; if your forwarding is set up properly, it won't be a problem after
that.
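For example, an entry of this shape (both values are placeholders):

    # /etc/hosts: resolve the public EC2 hostname to the internal IP
    10.0.0.5   ec2-54-0-0-1.compute-1.amazonaws.com    # placeholders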



Thanks
Best Regards
