Posted to common-dev@hadoop.apache.org by 周辉 <zh...@gmail.com> on 2009/01/03 07:44:58 UTC

I can run hadoop,

Hi,
  I want to run Hadoop, but there is an error. Can you help me?
  The log is:

  2009-01-03 14:10:52,109 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting DataNode
STARTUP_MSG:   host = f2/192.168.1.102
STARTUP_MSG:   args = []
STARTUP_MSG:   version = 0.19.0
STARTUP_MSG:   build =
https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.19 -r 713890;
compiled by 'ndaley' on Fri Nov 14 03:12:29 UTC 2008
************************************************************/
2009-01-03 14:10:54,963 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 0 time(s).
2009-01-03 14:10:55,965 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 1 time(s).
2009-01-03 14:10:56,969 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 2 time(s).
2009-01-03 14:10:57,972 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 3 time(s).
2009-01-03 14:10:58,974 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 4 time(s).
2009-01-03 14:10:59,976 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 5 time(s).
2009-01-03 14:11:00,978 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 6 time(s).
2009-01-03 14:11:01,981 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 7 time(s).
2009-01-03 14:11:02,986 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 8 time(s).
2009-01-03 14:11:04,002 INFO org.apache.hadoop.ipc.Client: Retrying connect
to server: /192.168.1.55:9000. Already tried 9 time(s).
2009-01-03 14:11:04,061 ERROR
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call
to /192.168.1.55:9000 failed on local exception: No route to host
    at org.apache.hadoop.ipc.Client.call(Client.java:699)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
    at $Proxy4.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:319)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:306)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:343)
    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:288)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:258)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:205)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1199)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1154)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1162)
    at
org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1284)
Caused by: java.net.NoRouteToHostException: No route to host
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at
sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
    at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:100)
    at
org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:299)
    at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
    at org.apache.hadoop.ipc.Client.getConnection(Client.java:772)
    at org.apache.hadoop.ipc.Client.call(Client.java:685)
    ... 12 more

2009-01-03 14:11:04,072 INFO
org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down DataNode at f2/192.168.1.102
************************************************************/
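
The address the DataNode keeps retrying, 192.168.1.55:9000, is whatever fs.default.name resolves to on the datanode (read from conf/hadoop-site.xml in 0.19). A quick sanity check on f2, sketched here assuming a standard Linux tarball install, would be:

    # on f2, from the Hadoop install directory
    grep -A 1 'fs.default.name' conf/hadoop-site.xml
    # the <value> should point at the NameNode, e.g. hdfs://192.168.1.55:9000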

Re: I can run hadoop,

Posted by Aaron Kimball <aa...@cloudera.com>.
org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call
to /192.168.1.55:9000 failed on local exception: No route to host

It sounds like 192.168.1.102 cannot see the master at 192.168.1.55. Can you
ping or ssh from .102 to .55?
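For example, from f2 (assuming a Linux box with ping, nc and ssh available), something like:

    # run on f2 (192.168.1.102)
    ping -c 3 192.168.1.55           # basic reachability of the master
    nc -zv 192.168.1.55 9000         # can the NameNode RPC port be reached?
    ssh 192.168.1.55 hostname        # same path the Hadoop start scripts use

If ping succeeds but the port check fails, a firewall (e.g. iptables) on the master rejecting the connection is a common cause of "No route to host".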

- Aaron


On Fri, Jan 2, 2009 at 10:44 PM, 周辉 <zh...@gmail.com> wrote:

> Hi,
>  I want to run Hadoop, but there is an error. Can you help me?
>  The log is:
>
>  2009-01-03 14:10:52,109 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = f2/192.168.1.102
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 0.19.0
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.19 -r
> 713890;
> compiled by 'ndaley' on Fri Nov 14 03:12:29 UTC 2008
> ************************************************************/
> 2009-01-03 14:10:54,963 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 0 time(s).
> 2009-01-03 14:10:55,965 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 1 time(s).
> 2009-01-03 14:10:56,969 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 2 time(s).
> 2009-01-03 14:10:57,972 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 3 time(s).
> 2009-01-03 14:10:58,974 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 4 time(s).
> 2009-01-03 14:10:59,976 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 5 time(s).
> 2009-01-03 14:11:00,978 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 6 time(s).
> 2009-01-03 14:11:01,981 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 7 time(s).
> 2009-01-03 14:11:02,986 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 8 time(s).
> 2009-01-03 14:11:04,002 INFO org.apache.hadoop.ipc.Client: Retrying connect
> to server: /192.168.1.55:9000. Already tried 9 time(s).
> 2009-01-03 14:11:04,061 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call
> to /192.168.1.55:9000 failed on local exception: No route to host
>    at org.apache.hadoop.ipc.Client.call(Client.java:699)
>    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
>    at $Proxy4.getProtocolVersion(Unknown Source)
>    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:319)
>    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:306)
>    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:343)
>    at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:288)
>    at
>
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:258)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:205)
>    at
>
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1199)
>    at
>
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1154)
>    at
>
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1162)
>    at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1284)
> Caused by: java.net.NoRouteToHostException: No route to host
>    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>    at
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
>    at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:100)
>    at
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:299)
>    at org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>    at org.apache.hadoop.ipc.Client.getConnection(Client.java:772)
>    at org.apache.hadoop.ipc.Client.call(Client.java:685)
>    ... 12 more
>
> 2009-01-03 14:11:04,072 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at f2/192.168.1.102
> ************************************************************/
>

Re: I can run hadoop,

Posted by tienduc_dinh <ti...@yahoo.com>.
I think you should start the services. Try start-dfs.sh and start-mapred.sh.
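
In a stock 0.19 tarball layout those scripts live under bin/ and are run on the master node, roughly like this:

    # on the master (192.168.1.55), from the Hadoop install directory
    bin/start-dfs.sh              # starts the NameNode and the DataNodes listed in conf/slaves
    bin/start-mapred.sh           # starts the JobTracker and TaskTrackers
    bin/hadoop dfsadmin -report   # afterwards, check that the datanodes have registered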


周辉 wrote:
> 
> Hi,
>   I want to run Hadoop, but there is an error. Can you help me?
>   The log is:
> 
>   2009-01-03 14:10:52,109 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: STARTUP_MSG:
> /************************************************************
> STARTUP_MSG: Starting DataNode
> STARTUP_MSG:   host = f2/192.168.1.102
> STARTUP_MSG:   args = []
> STARTUP_MSG:   version = 0.19.0
> STARTUP_MSG:   build =
> https://svn.apache.org/repos/asf/hadoop/core/branches/branch-0.19 -r
> 713890;
> compiled by 'ndaley' on Fri Nov 14 03:12:29 UTC 2008
> ************************************************************/
> 2009-01-03 14:10:54,963 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 0 time(s).
> 2009-01-03 14:10:55,965 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 1 time(s).
> 2009-01-03 14:10:56,969 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 2 time(s).
> 2009-01-03 14:10:57,972 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 3 time(s).
> 2009-01-03 14:10:58,974 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 4 time(s).
> 2009-01-03 14:10:59,976 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 5 time(s).
> 2009-01-03 14:11:00,978 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 6 time(s).
> 2009-01-03 14:11:01,981 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 7 time(s).
> 2009-01-03 14:11:02,986 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 8 time(s).
> 2009-01-03 14:11:04,002 INFO org.apache.hadoop.ipc.Client: Retrying
> connect
> to server: /192.168.1.55:9000. Already tried 9 time(s).
> 2009-01-03 14:11:04,061 ERROR
> org.apache.hadoop.hdfs.server.datanode.DataNode: java.io.IOException: Call
> to /192.168.1.55:9000 failed on local exception: No route to host
>     at org.apache.hadoop.ipc.Client.call(Client.java:699)
>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:216)
>     at $Proxy4.getProtocolVersion(Unknown Source)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:319)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:306)
>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:343)
>     at org.apache.hadoop.ipc.RPC.waitForProxy(RPC.java:288)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:258)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:205)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:1199)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:1154)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:1162)
>     at
> org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:1284)
> Caused by: java.net.NoRouteToHostException: No route to host
>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>     at
> sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:574)
>     at sun.nio.ch.SocketAdaptor.connect(SocketAdaptor.java:100)
>     at
> org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:299)
>     at
> org.apache.hadoop.ipc.Client$Connection.access$1700(Client.java:176)
>     at org.apache.hadoop.ipc.Client.getConnection(Client.java:772)
>     at org.apache.hadoop.ipc.Client.call(Client.java:685)
>     ... 12 more
> 
> 2009-01-03 14:11:04,072 INFO
> org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
> /************************************************************
> SHUTDOWN_MSG: Shutting down DataNode at f2/192.168.1.102
> ************************************************************/
> 
> 

-- 
View this message in context: http://www.nabble.com/I-can-run-hadoop%2C-tp21263335p21320109.html
Sent from the Hadoop core-user mailing list archive at Nabble.com.

