Posted to user@spark.apache.org by Sagar <sa...@yonsei.ac.kr> on 2015/07/28 14:01:54 UTC

Cluster setup for Spark standalone application:

Hello Sir,

 

I am an MS student working on Spark, and I am totally new to this field. I have installed Spark.

 

The local spark-shell works fine, but whenever I try to connect to the master I get errors.

 

When I run this command:

MASTER=spark://hadoopm0:7077 spark-shell

 

I get errors like:

 

15/07/27 21:17:26 INFO AppClient$ClientActor: Connecting to master
spark://hadoopm0:7077...

15/07/27 21:17:46 ERROR SparkDeploySchedulerBackend: Application has been
killed. Reason: All masters are unresponsive! Giving up.

15/07/27 21:17:46 WARN SparkDeploySchedulerBackend: Application ID is not
initialized yet.

15/07/27 21:17:46 ERROR TaskSchedulerImpl: Exiting due to error from cluster
scheduler: All masters are unresponsive! Giving up.

 

I have also attached a screenshot of the Master UI.

 

Could you please point me to some references or documentation on how to solve this issue?

Thanks in advance.

Thank you,

 




Re: Cluster setup for Spark standalone application:

Posted by Dean Wampler <de...@gmail.com>.
When you say you installed Spark, did you install the master and slave
services for standalone mode as described here
<http://spark.apache.org/docs/latest/spark-standalone.html>? If you
intended to run Spark on Hadoop, see here
<http://spark.apache.org/docs/latest/running-on-yarn.html>.
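
For reference, a minimal sketch of starting the standalone daemons by hand (assuming a default Spark layout under $SPARK_HOME and that hadoopm0 is your intended master host):

```shell
# On the master host (hadoopm0): start the standalone master.
# Its log prints the master URL (spark://hadoopm0:7077) and it serves a web UI on :8080.
$SPARK_HOME/sbin/start-master.sh

# On each worker host: start a worker and point it at the master URL.
$SPARK_HOME/sbin/start-slave.sh spark://hadoopm0:7077

# Then connect a shell against that master:
MASTER=spark://hadoopm0:7077 spark-shell
```

If the master is actually running, its web UI (http://hadoopm0:8080 by default) should list the registered workers.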

It looks like either the master service isn't running or isn't reachable
over your network. Is hadoopm0 publicly routable? Is port 7077 blocked? As
a test, can you telnet to it?
    telnet hadoopm0 7077
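
If telnet isn't available, the same reachability check can be sketched with nc, and it's worth confirming which address the master actually bound to (an assumption here: your spark-env.sh may be setting SPARK_MASTER_IP to something other than hadoopm0):

```shell
# From the client machine: check that the master port is reachable.
nc -vz hadoopm0 7077

# On the master host: see which address/port the master bound to.
grep -i "master" $SPARK_HOME/logs/spark-*-org.apache.spark.deploy.master.Master-*.out

# If spark-env.sh sets SPARK_MASTER_IP, or the master bound to localhost,
# the URL in the Master UI won't match spark://hadoopm0:7077.
cat $SPARK_HOME/conf/spark-env.sh
```

The master URL shown at the top of the Master UI is the exact string to pass to spark-shell; standalone mode matches it literally, so a hostname-vs-IP mismatch produces exactly this "All masters are unresponsive" failure.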



Dean Wampler, Ph.D.
Author: Programming Scala, 2nd Edition
<http://shop.oreilly.com/product/0636920033073.do> (O'Reilly)
Typesafe <http://typesafe.com>
@deanwampler <http://twitter.com/deanwampler>
http://polyglotprogramming.com

On Tue, Jul 28, 2015 at 7:01 AM, Sagar <sa...@yonsei.ac.kr> wrote:

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org