Posted to user@spark.apache.org by Chen Jingci <cj...@gmail.com> on 2013/11/02 02:00:46 UTC

RE: Spark with HDFS: ERROR Worker: Connection to master failed! Shutting down.

Can you try to use the IP address instead of the name 'base'? I experienced the same problem before, and it worked after I changed to the IP.

Thanks,
Chen jingci
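(Not part of the original thread: one way to narrow this down is to check raw TCP reachability of the master's port, by IP and by hostname separately. If the IP works but the hostname does not, the problem is name resolution rather than Spark itself. A minimal sketch; the host and port values are the ones from this thread and are illustrative.)

```python
import socket

def master_reachable(host, port, timeout=3.0):
    """Attempt a plain TCP connection to the master's RPC port.

    Returns True if a connection can be opened, False on refusal,
    timeout, or resolution failure.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (values from the thread; adjust to your setup):
# master_reachable("192.168.2.2", 7077)
# master_reachable("base", 7077)
```

Comparing the two calls tells you whether 'base' resolves to the address the master is actually bound to.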

-----Original Message-----
From: "Thorsten Bergler" <th...@tbonline.de>
Sent: 2/11/2013 2:03
To: "user@spark.incubator.apache.org" <us...@spark.incubator.apache.org>
Subject: Spark with HDFS: ERROR Worker: Connection to master failed! Shutting down.

Hello,


I am new to Spark and taking my first steps with it today.

Right now I am having trouble with the error:

ERROR Worker: Connection to master failed! Shutting down.

So far, I found out the following:

The standalone version of Spark (without Hadoop-HDFS and YARN) works perfectly. Both the master and the worker start, and the worker registers with the master.

Logfile of the master:

Spark Command: /usr/lib/jvm/jdk//bin/java -cp :/home/thorsten/spark/conf:/home/thorsten/spark/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.4.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip base --port 7077 --webui-port 8080
========================================

13/11/01 18:51:14 INFO Slf4jEventHandler: Slf4jEventHandler started
13/11/01 18:51:14 INFO Master: Starting Spark master at spark://base:7077
13/11/01 18:51:14 INFO MasterWebUI: Started Master web UI at http://base:8080
13/11/01 18:51:17 INFO Master: Registering worker base:32942 with 1 cores, 512.0 MB RAM


Because I want to try out Spark with HDFS as well, I also compiled it this way:

SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly

But with this compiled version, the workers can no longer register with the master.

Here is the logfile of the slave:

Spark Command: java -cp :/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.2.0.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker spark://base:7077 --webui-port 8081
========================================

13/11/01 18:49:57 INFO Slf4jEventHandler: Slf4jEventHandler started
13/11/01 18:49:57 INFO Worker: Starting Spark worker base:37950 with 1 cores, 512.0 MB RAM
13/11/01 18:49:57 INFO Worker: Spark home: /home/thorsten/spark-hdfs
13/11/01 18:49:57 INFO WorkerWebUI: Started Worker web UI at http://base:8081
13/11/01 18:49:57 INFO Worker: Connecting to master spark://base:7077
13/11/01 18:49:58 ERROR Worker: Connection to master failed! Shutting down.

The logfile of the master:

Spark Command: /usr/lib/jvm/jdk//bin/java -cp :/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.2.0.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip base --port 7077 --webui-port 8080
========================================

13/11/01 18:49:55 INFO Slf4jEventHandler: Slf4jEventHandler started
13/11/01 18:49:55 INFO Master: Starting Spark master at spark://base:7077
13/11/01 18:49:55 INFO MasterWebUI: Started Master web UI at http://base:8080

I hope somebody can help me solve this problem.

Thanks
Thorsten

Re: Spark with HDFS: ERROR Worker: Connection to master failed! Shutting down.

Posted by Thorsten Bergler <th...@tbonline.de>.
Hello,

here is my /etc/hosts :

127.0.0.1       localhost
192.168.2.2     base

So I don't have two IP addresses for one hostname.

I also tried the IP address instead of the name 'base', but that did not help either.

Spark Command: java -cp :/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.2.0.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker spark://192.168.2.2:7077 --webui-port 8081
========================================

13/11/02 12:23:36 INFO Slf4jEventHandler: Slf4jEventHandler started
13/11/02 12:23:37 INFO Worker: Starting Spark worker base:46257 with 1 cores, 512.0 MB RAM
13/11/02 12:23:37 INFO Worker: Spark home: /home/thorsten/spark-hdfs
13/11/02 12:23:37 INFO WorkerWebUI: Started Worker web UI at http://base:8081
13/11/02 12:23:37 INFO Worker: Connecting to master spark://192.168.2.2:7077
13/11/02 12:23:37 ERROR Worker: Connection to master failed! Shutting down.


The strange thing is, as I said before, that I have compiled two versions of Spark: one without HDFS and one with SPARK_HADOOP_VERSION=2.2.0.

The version compiled with plain "sbt/sbt assembly" works without problems, as you can see here:

Spark Command: java -cp :/home/thorsten/spark/conf:/home/thorsten/spark/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.4.jar -Djava.library.path= -Xms512m -Xmx512m org.apache.spark.deploy.worker.Worker spark://192.168.2.2:7077 --webui-port 8081
========================================

13/11/02 12:27:00 INFO Slf4jEventHandler: Slf4jEventHandler started
13/11/02 12:27:00 INFO Worker: Starting Spark worker base:45833 with 1 cores, 512.0 MB RAM
13/11/02 12:27:00 INFO Worker: Spark home: /home/thorsten/spark
13/11/02 12:27:00 INFO WorkerWebUI: Started Worker web UI at http://base:8081
13/11/02 12:27:00 INFO Worker: Connecting to master spark://192.168.2.2:7077
13/11/02 12:27:00 INFO Worker: Successfully registered with master


Could this be a Hadoop version compatibility problem?
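(Not part of the original thread: one quick sanity check is that the master and worker were launched from the same assembly build. The Hadoop version is encoded in the assembly jar name visible in the Spark Command lines above, so a small hypothetical helper can extract and compare it.)

```python
import re

def hadoop_version(classpath):
    """Extract the Hadoop version baked into a Spark assembly jar name,
    e.g. 'spark-assembly-0.8.0-incubating-hadoop2.2.0.jar' -> '2.2.0'.
    Returns None if no assembly jar is found on the classpath.
    """
    m = re.search(r"spark-assembly-[^:]*-hadoop([\d.]+)\.jar", classpath)
    return m.group(1) if m else None

# Classpaths taken from the log snippets above -- master and worker
# should agree on the Hadoop version:
worker_cp = ("/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/"
             "spark-assembly-0.8.0-incubating-hadoop2.2.0.jar")
master_cp = ("/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/"
             "assembly/target/scala-2.9.3/"
             "spark-assembly-0.8.0-incubating-hadoop2.2.0.jar")
assert hadoop_version(worker_cp) == hadoop_version(master_cp) == "2.2.0"
```

If one daemon is accidentally started from the hadoop1.0.4 build and the other from the hadoop2.2.0 build, they will fail to talk to each other even though each side starts cleanly.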





On 02.11.2013 at 02:11, dachuan wrote:
> I have met a similar problem. My solution is: always guarantee there 
> is only one IP address for one hostname.
>
> For example, in /etc/hosts, you shouldn't let this happen:
> 127.0.1.1 base
> 168.144.8.8 base


Re: Spark with HDFS: ERROR Worker: Connection to master failed! Shutting down.

Posted by dachuan <hd...@gmail.com>.
I have run into a similar problem. My solution: always make sure each hostname resolves to exactly one IP address.

For example, in /etc/hosts, you shouldn't let this happen:
127.0.1.1 base
168.144.8.8 base
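(Not part of the original thread: the condition above can be checked programmatically. A minimal sketch that lists the distinct IPv4 addresses a hostname resolves to; more than one entry, e.g. 127.0.1.1 plus a LAN address, is exactly the situation to avoid for the master's hostname.)

```python
import socket

def resolved_addresses(hostname):
    """Return the set of distinct IPv4 addresses a hostname resolves to."""
    infos = socket.getaddrinfo(hostname, None, socket.AF_INET)
    return {info[4][0] for info in infos}

# Example: on a correctly configured node, resolved_addresses("base")
# should be a single address such as {"192.168.2.2"}.
```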


On Fri, Nov 1, 2013 at 9:00 PM, Chen Jingci <cj...@gmail.com> wrote:

>  Can you try to use the IP address instead of the name 'base'? I also
> experience the same problem before, it worked after changed to IP.
>
> Thanks,
> Chen jingci
>  ------------------------------
> From: Thorsten Bergler <th...@tbonline.de>
> Sent: 2/11/2013 2:03
> To: user@spark.incubator.apache.org
> Subject: Spark with HDFS: ERROR Worker: Connection to master failed!
> Shuttingdown.
>
> Hello,
>
> I am new to Spark and doing my first steps with it today.
>
> Right now I am having trouble with the error:
>
> ERROR Worker: Connection to master failed! Shutting down.
>
> So far, I found out the following:
>
> The standalone version of Spark (without Hadoop-HDFS and YARN) works
> perfectly. The master starts and also the worker starts and get registered
> at the master.
>
> Logfile of the master:
>
> Spark Command: /usr/lib/jvm/jdk//bin/java -cp
> :/home/thorsten/spark/conf:/home/thorsten/spark/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop1.0.4.jar
> -Djava.library.path= -Xms512m -Xmx512m
> org.apache.spark.deploy.master.Master --ip base --port 7077 --webui-port
> 8080
> ========================================
>
> 13/11/01 18:51:14 INFO Slf4jEventHandler: Slf4jEventHandler started
> 13/11/01 18:51:14 INFO Master: Starting Spark master at spark://base:7077
> 13/11/01 18:51:14 INFO MasterWebUI: Started Master web UI at
> http://base:8080
> 13/11/01 18:51:17 INFO Master: Registering worker base:32942 with 1 cores,
> 512.0 MB RAM
>
>
> Because I want to try out Spark with HDFS too, I tried to compile it also
> that way:
>
> SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly
>
> But with this compiled version, the registration of workers at the master
> is not working anymore.
>
> Here is the logfile of the slave:
>
> Spark Command: java -cp
> :/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.2.0.jar
> -Djava.library.path= -Xms512m -Xmx512m
> org.apache.spark.deploy.worker.Worker spark://base:7077 --webui-port 8081
> ========================================
>
> 13/11/01 18:49:57 INFO Slf4jEventHandler: Slf4jEventHandler started
> 13/11/01 18:49:57 INFO Worker: Starting Spark worker base:37950 with 1
> cores, 512.0 MB RAM
> 13/11/01 18:49:57 INFO Worker: Spark home: /home/thorsten/spark-hdfs
> 13/11/01 18:49:57 INFO WorkerWebUI: Started Worker web UI at
> http://base:8081
> 13/11/01 18:49:57 INFO Worker: Connecting to master spark://base:7077
> 13/11/01 18:49:58 ERROR Worker: Connection to master failed! Shutting down.
>
> The logfile of the master:
>
> Spark Command: /usr/lib/jvm/jdk//bin/java -cp
> :/home/thorsten/spark-hdfs/conf:/home/thorsten/spark-hdfs/assembly/target/scala-2.9.3/spark-assembly-0.8.0-incubating-hadoop2.2.0.jar
> -Djava.library.path= -Xms512m -Xmx512m
> org.apache.spark.deploy.master.Master --ip base --port 7077 --webui-port
> 8080
> ========================================
>
> 13/11/01 18:49:55 INFO Slf4jEventHandler: Slf4jEventHandler started
> 13/11/01 18:49:55 INFO Master: Starting Spark master at spark://base:7077
> 13/11/01 18:49:55 INFO MasterWebUI: Started Master web UI at
> http://base:8080
>
> Hope anybody could help me, solving that problem.
>
> Thanks
> Thorsten
>



-- 
Dachuan Huang
Cellphone: 614-390-7234
2015 Neil Avenue
Ohio State University
Columbus, Ohio
U.S.A.
43210