Posted to user@spark.apache.org by Yingkai Hu <yi...@gmail.com> on 2014/11/26 01:19:07 UTC

Submitting job from local to EC2 cluster

Hi All,

I have Spark deployed to an EC2 cluster and was able to run jobs successfully when the driver resides within the cluster. However, the job was killed when I tried to submit it from my local machine. My guess is that the Spark cluster can’t open a connection back to the driver since it is on my machine.

I’m wondering whether Spark actually supports submitting jobs from a local machine. If so, would you please advise?

Many thanks in advance!

YK
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Submitting job from local to EC2 cluster

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Yes, it is possible to submit jobs to a remote Spark cluster. Just make
sure you follow the steps below.

1. Set spark.driver.host to your local IP (the machine where you run your
code; it must be reachable from the cluster).

2. Make sure no firewall/router configurations are blocking/filtering the
connection between your local machine and the cluster. The best way to
test is to ping your machine's public IP from the cluster. (And if the
ping works, make sure you are port-forwarding the required ports.)

3. Also set spark.driver.port if you don't want to open up all the ports
on your machine (the default is a random port, so pin it to one port).
A minimal sketch of that configuration in Scala is shown below.
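For reference, here is roughly what that looks like in code. The master
URL, IP, and port below are placeholder values you'd replace with your
own:

    import org.apache.spark.{SparkConf, SparkContext}

    // Placeholders: swap in your own master host, public IP, and port.
    val conf = new SparkConf()
      .setAppName("remote-submit-test")
      .setMaster("spark://ec2-master.example.com:7077")
      .set("spark.driver.host", "203.0.113.10") // step 1: your machine's public IP
      .set("spark.driver.port", "51000")        // step 3: pin the port to forward

    val sc = new SparkContext(conf)
    // If the executors can connect back to the driver, this count returns;
    // if they can't, the job hangs or gets killed, matching the symptom above.
    println(sc.parallelize(1 to 100).count())
    sc.stop()

The same settings can also be passed on the command line via spark-submit,
e.g. --conf spark.driver.host=203.0.113.10 --conf spark.driver.port=51000
alongside --master spark://<master-host>:7077.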


Thanks
Best Regards
