Posted to mapreduce-user@hadoop.apache.org by Pedro Sa da Costa <ps...@gmail.com> on 2014/01/09 11:35:29 UTC

org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException

When I try to launch the namenode and the datanode in MRv2, the datanode 
can't connect to the namenode, and I get the error below. I have also 
included the core-site.xml file that I use.

The firewall on the hosts is disabled, and I have no excluded nodes 
defined. Why can't the datanodes connect to the namenode? Any help 
solving this problem?


org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException):
Datanode denied communication with namenode: DatanodeRegistration(0.0.0.0, storageID=DS-1449645935-172.16.1.10-50010-1389224474955, infoPort=50075, ipcPort=50020, storageInfo=lv=-40;cid=CID-9a8571a3-17ae-49b2-b957-b009e88b9f9a;nsid=934416283;c=0)
         at org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager.registerDatanode(DatanodeManager.java:631)
         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.registerDatanode(FSNamesystem.java:3398)
         at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.registerDatanode(NameNodeRpcServer.java:881)
         at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolServerSideTranslatorPB.registerDatanode(DatanodeProtocolServerSideTranslatorPB.java:90)
         at org.apache.hadoop.hdfs.protocol.proto.DatanodeProtocolProtos$DatanodeProtocolService$2.callBlockingMethod(DatanodeProtocolProtos.java:18295)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:454)
         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1014)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1741)
         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1737)
         at java.security.AccessController.doPrivileged(Native Method)
         at javax.security.auth.Subject.doAs(Subject.java:416)
         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1478)
         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1735)

         at org.apache.hadoop.ipc.Client.call(Client.java:1235)
         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
         at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
         at java.lang.reflect.Method.invoke(Method.java:622)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
         at com.sun.proxy.$Proxy9.registerDatanode(Unknown Source)
         at org.apache.hadoop.hdfs.protocolPB.DatanodeProtocolClientSideTranslatorPB.registerDatanode(DatanodeProtocolClientSideTranslatorPB.java:146)
         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.register(BPServiceActor.java:623)
         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:225)
         at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:664)
         at java.lang.Thread.run(Thread.java:701)

The core-site.xml I use is:

<configuration>
   <property> <name>fs.default.name</name> <value>hdfs://10.103.0.17:9000</value> </property>
   <property> <name>hadoop.tmp.dir</name> <value>/tmp/hadoop-temp</value> </property>
   <property> <name>hadoop.proxyuser.root.hosts</name> <value>*</value> </property>
   <property> <name>hadoop.proxyuser.root.groups</name> <value>*</value> </property>
</configuration>
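
(As an aside: in Hadoop 2.x the fs.default.name key still works but is
deprecated in favor of fs.defaultFS. A minimal sketch of the equivalent
setting under the newer name would be:)

<configuration>
  <!-- fs.defaultFS is the Hadoop 2.x replacement for fs.default.name -->
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://10.103.0.17:9000</value>
  </property>
</configuration>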

-- 
Best regards,


Re: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException

Posted by Pedro Sa da Costa <ps...@gmail.com>.
My hdfs-site.xml and dfs.include files contain the following:

172:~/Programs/hadoop-mapreduce-manager-python# cat ../hadoop/etc/hadoop/hdfs-site.xml
<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>

<!-- Put site-specific property overrides in this file. -->
<configuration>
         <property> <name>dfs.replication</name> <value>1</value> </property>
         <property> <name>dfs.permissions</name> <value>false</value> </property>
         <property> <name>dfs.name.dir</name> <value>/tmp/data/dfs/name/</value> </property>
         <property> <name>dfs.data.dir</name> <value>/tmp/data/dfs/data/</value> </property>
         <property> <name>dfs.hosts</name> <value>/root/Programs/hadoop/etc/hadoop/dfs.include</value> </property>
</configuration>

172:~/Programs/hadoop-mapreduce-manager-python# cat /root/Programs/hadoop/etc/hadoop/dfs.include
172.16.XXX.XXX
172.16.XXX.XXX
172.16.XXX.XXX
172.16.XXX.XXX


But I still get the Exception. What is missing?
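
One thing worth checking, given Harsh's point quoted below: the rejected
registration above reports the datanode as 0.0.0.0 (with 172.16.1.10
embedded in the storageID), and the dfs.include entry has to match the
hostname/IP that the namenode actually resolves for the connecting
datanode. A quick diagnostic sketch, assuming the paths and the IP from
this thread:

# On the datanode: what name and address does this host present?
hostname -f
getent hosts $(hostname -f)

# On the namenode: does the DN's address, exactly as the NN sees it
# (the trace shows 172.16.1.10), appear in the include file?
grep -n '172.16.1.10' /root/Programs/hadoop/etc/hadoop/dfs.include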


On 01/09/2014 11:26 AM, Harsh J wrote:
> Your hdfs-site.xml on the NN defines an "includes" file, but the
> includes file does not list this connecting DN's proper hostname/IP,
> causing the NN to reject it when it tries to register itself at
> startup.
>
> On Thu, Jan 9, 2014 at 4:05 PM, Pedro Sa da Costa <ps...@gmail.com> wrote:
>> When I try to launch the namenode and the datanode in MRv2, the datanode
>> can't connect to the namenode, giving me the error below. I also put the
>> core-site file that I use below.
>> [stack trace and core-site.xml snipped; quoted in full in the original
>> message above]

-- 
Best regards,


Re: org.apache.hadoop.hdfs.server.protocol.DisallowedDatanodeException

Posted by Harsh J <ha...@cloudera.com>.
Your hdfs-site.xml on the NN defines an "includes" file, but the
includes file does not list this connecting DN's proper hostname/IP,
causing the NN to reject it when it tries to register itself at
startup.
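
A minimal sketch of the fix, assuming the include-file path from this
thread (the hostname below is a placeholder; use the name or IP the NN
actually resolves for the DN):

# On the NN host: add the DN to the include file referenced by dfs.hosts.
echo "datanode1.example.com" >> /root/Programs/hadoop/etc/hadoop/dfs.include

# Make the NN re-read its include/exclude files without a restart.
hdfs dfsadmin -refreshNodes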

On Thu, Jan 9, 2014 at 4:05 PM, Pedro Sa da Costa <ps...@gmail.com> wrote:
>
> When I try to launch the namenode and the datanode in MRv2, the datanode
> can't connect to the namenode, giving me the error below. I also put the
> core-site file that I use below.
> [stack trace and core-site.xml snipped; quoted in full in the original
> message above]



-- 
Harsh J
