Posted to mapreduce-user@hadoop.apache.org by Satyam Singh <sa...@ericsson.com> on 2014/03/13 08:59:13 UTC

Reading files from hdfs directory

Hello,

I want to read files from HDFS remotely through the camel-hdfs client.
I have modified the camel-hdfs component to support Hadoop 2.2.0.

I verified that the file I want to read exists on HDFS:

[hduser@bl460cx2425 ~]$ hadoop fs -ls /user/hduser/collector/test.txt
14/03/13 09:13:31 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r--   3 root supergroup       1886 2014-03-13 09:13 /user/hduser/collector/test.txt


But I get the following exception when I try to read it through my client from a remote machine:

2014-03-13 14:08:25 STDIO [ERROR] Caused by: org.apache.hadoop.ipc.RemoteException(java.io.FileNotFoundException): File does not exist: /user/hduser/collector/test.txt
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:61)
        at org.apache.hadoop.hdfs.server.namenode.INodeFile.valueOf(INodeFile.java:51)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsUpdateTimes(FSNamesystem.java:1499)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocationsInt(FSNamesystem.java:1448)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1428)
        at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getBlockLocations(FSNamesystem.java:1402)
        at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getBlockLocations(NameNodeRpcServer.java:468)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getBlockLocations(ClientNamenodeProtocolServerSideTranslatorPB.java:269)
        at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59566)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
        at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Unknown Source)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.Client.call(Client.java:1347)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.Client.call(Client.java:1300)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
2014-03-13 14:08:25 STDIO [ERROR] at com.sun.proxy.$Proxy17.getBlockLocations(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at sun.reflect.GeneratedMethodAccessor34.invoke(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
2014-03-13 14:08:25 STDIO [ERROR] at java.lang.reflect.Method.invoke(Method.java:606)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
2014-03-13 14:08:25 STDIO [ERROR] at com.sun.proxy.$Proxy17.getBlockLocations(Unknown Source)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getBlockLocations(ClientNamenodeProtocolTranslatorPB.java:188)
2014-03-13 14:08:25 STDIO [ERROR] at org.apache.hadoop.hdfs.DFSClient.callGetBlockLocations(DFSClient.java:1064)
2014-03-13 14:08:25 STDIO [ERROR] ... 24 more


In my client I FTP files from a remote FTP server and put them into HDFS under the path /user/hduser/collector. Then we pass the file name to our HDFS file-reading client, which throws the above exception.
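One thing worth ruling out (an assumption on my side, since the reading-client code isn't shown): if the file name handed to the reading client is bare or relative rather than the absolute HDFS path, the NameNode will report FileNotFoundException even though `hadoop fs -ls` finds the file. A minimal sketch of normalizing the name before the read, using the directory from this thread; the helper name is hypothetical:

```python
import posixpath

# Directory used in this thread; adjust to your deployment.
COLLECTOR_DIR = "/user/hduser/collector"

def to_hdfs_path(file_name, base_dir=COLLECTOR_DIR):
    """Return an absolute HDFS path for a name that may be bare or relative."""
    if posixpath.isabs(file_name):
        return posixpath.normpath(file_name)
    return posixpath.normpath(posixpath.join(base_dir, file_name))

print(to_hdfs_path("test.txt"))
# -> /user/hduser/collector/test.txt
print(to_hdfs_path("/user/hduser/collector/test.txt"))
# -> /user/hduser/collector/test.txt
```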


Prompt help is really appreciated :)

BR,
Satyam






RE: Reading files from hdfs directory

Posted by Vinayakumar B <vi...@huawei.com>.
Hi Satyam,

Check whether your Camel client-side configuration points to the correct NameNode(s).

What is the deployment: HA or non-HA?

Also check whether the same exception appears in the (active) NameNode's logs. If it does not, the request is going to some other NameNode.
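As a quick client-side sanity check, you can compare the host and port in the URI your client actually uses against the NameNode address where `hadoop fs -ls` succeeded. A sketch with stdlib URL parsing; the host and port values here are placeholders, not taken from the thread:

```python
from urllib.parse import urlparse

# Placeholders: substitute the URI your Camel endpoint uses and the
# NameNode address from your cluster's core-site.xml (fs.defaultFS).
client_uri = "hdfs://namenode-host:8020/user/hduser/collector/test.txt"
expected_namenode = ("namenode-host", 8020)

parsed = urlparse(client_uri)
# A mismatch here means the read request is going to a different
# NameNode than the one that listed the file.
matches = (parsed.hostname, parsed.port) == expected_namenode
print("client points at expected NameNode:", matches)
```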

Regards,
Vinayakumar B.
From: Satyam Singh [mailto:satyam.singh@ericsson.com]
Sent: 13 March 2014 13:29
To: user@hadoop.apache.org
Subject: Reading files from hdfs directory
