Posted to common-user@hadoop.apache.org by Jay Vyas <ja...@gmail.com> on 2011/10/31 00:47:33 UTC

getting there (EOF exception).

Hi guys: What is the meaning of an EOF exception when trying to connect
to Hadoop by creating a new FileSystem object? Does this simply mean
the system can't be read?

java.io.IOException: Call to /172.16.112.131:50070 failed on local
exception: java.io.EOFException
    at org.apache.hadoop.ipc.Client.wrapException(Client.java:1139)
    at org.apache.hadoop.ipc.Client.call(Client.java:1107)
    at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:226)
    at $Proxy0.getProtocolVersion(Unknown Source)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:398)
    at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:384)
    at org.apache.hadoop.hdfs.DFSClient.createRPCNamenode(DFSClient.java:111)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:213)
    at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:180)
    at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1514)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:1548)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1530)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:228)
    at sb.HadoopRemote.main(HadoopRemote.java:35)
Caused by: java.io.EOFException
    at java.io.DataInputStream.readInt(DataInputStream.java:375)
    at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:812)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:720)
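[Editor's note: the root cause in the trace above is `DataInputStream.readInt` hitting end-of-stream, i.e. the remote end closed the connection before a full 4-byte int arrived -- which is what happens when the peer does not speak the Hadoop IPC protocol. A minimal, self-contained sketch of that failure mode, with no Hadoop involved:]

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;

public class EofDemo {
    public static void main(String[] args) throws Exception {
        // readInt() needs 4 bytes; an empty stream ends immediately, so it
        // throws EOFException -- the same root cause as in the stack trace,
        // where the server closed the connection without sending a valid
        // IPC response.
        DataInputStream in =
            new DataInputStream(new ByteArrayInputStream(new byte[0]));
        try {
            in.readInt();
            System.out.println("no exception");
        } catch (EOFException e) {
            System.out.println("EOFException");
        }
    }
}
```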

-- 
Jay Vyas
MMSB/UCHC

Re: getting there (EOF exception).

Posted by Jay Vyas <ja...@gmail.com>.
Harsh! That was the trick!

I changed fs.default.name from "localhost" to 0.0.0.0.

Then my Java client could connect to my remote Hadoop namenode with no
problems!

Thanks !

In summary: if you need to connect to the namenode remotely, make sure
it is serving on 0.0.0.0, not on localhost and not on 127.0.0.1 (for
those of you who, like me, didn't realize it: localhost != 0.0.0.0).
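[Editor's note: the loopback-vs-wildcard distinction can be seen with plain java.net sockets. This sketch binds one listener to 127.0.0.1 (loopback only, unreachable from other hosts) and one to 0.0.0.0 (all interfaces); the class name and ephemeral ports are illustrative:]

```java
import java.net.InetSocketAddress;
import java.net.ServerSocket;

public class BindDemo {
    public static void main(String[] args) throws Exception {
        // Bind only to the loopback interface: remote clients cannot reach this.
        ServerSocket loopbackOnly = new ServerSocket();
        loopbackOnly.bind(new InetSocketAddress("127.0.0.1", 0));
        System.out.println("loopback: "
            + loopbackOnly.getInetAddress().getHostAddress());

        // Bind to the wildcard address: reachable on every interface.
        ServerSocket allInterfaces = new ServerSocket();
        allInterfaces.bind(new InetSocketAddress("0.0.0.0", 0));
        System.out.println("wildcard: "
            + allInterfaces.getInetAddress().getHostAddress());

        loopbackOnly.close();
        allInterfaces.close();
    }
}
```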

thank you thank you thank you

On Mon, Oct 31, 2011 at 12:21 AM, Harsh J <ha...@cloudera.com> wrote:

> What is your fs.default.name set to? It'd bind to the hostname provided
> in that.
>
> > [earlier messages in the thread snipped]



-- 
Jay Vyas
MMSB/UCHC

Re: getting there (EOF exception).

Posted by Harsh J <ha...@cloudera.com>.
What is your fs.default.name set to? The namenode binds to the hostname provided there.

On Mon, Oct 31, 2011 at 9:38 AM, JAX <ja...@gmail.com> wrote:
> Thanks! Yes, I agree... but are you sure about 8020? 8020 serves on 127.0.0.1 (rather than 0.0.0.0), so it is inaccessible to outside clients. That is very odd. Why would that be the case? Any insights? (Using Cloudera's Hadoop VM.)
>
> [earlier messages in the thread snipped]



-- 
Harsh J

Re: getting there (EOF exception).

Posted by JAX <ja...@gmail.com>.
Thanks! Yes, I agree... but are you sure about 8020? 8020 serves on 127.0.0.1 (rather than 0.0.0.0), so it is inaccessible to outside clients. That is very odd. Why would that be the case? Any insights? (Using Cloudera's Hadoop VM.)

Sent from my iPad

On Oct 30, 2011, at 11:48 PM, Harsh J <ha...@cloudera.com> wrote:

> Hey Jay,
> 
> I believe this may be related to your other issues as well, but 50070 is NOT the port you want to connect to. 50070 serves HTTP (the namenode web UI), while the default IPC port for fs.default.name is 8020, or whatever you have configured.
> 
> On 31-Oct-2011, at 5:17 AM, Jay Vyas wrote:
> 
>> [original message and stack trace snipped]

Re: getting there (EOF exception).

Posted by Harsh J <ha...@cloudera.com>.
Hey Jay,

I believe this may be related to your other issues as well, but 50070 is NOT the port you want to connect to. 50070 serves HTTP (the namenode web UI), while the default IPC port for fs.default.name is 8020, or whatever you have configured.
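[Editor's note: on the client side this means pointing fs.default.name at the namenode's IPC endpoint, not the web UI port. A sketch of a core-site.xml entry, assuming a hypothetical hostname `namenode.example.com` and the default IPC port 8020:]

```xml
<configuration>
  <!-- Hadoop IPC endpoint: hdfs:// scheme plus the IPC port (8020 by
       default), NOT port 50070, which serves the namenode web UI over
       HTTP. Hostname below is a placeholder. -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:8020</value>
  </property>
</configuration>
```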

On 31-Oct-2011, at 5:17 AM, Jay Vyas wrote:

> Hi guys: What is the meaning of an EOF exception when trying to connect
> to Hadoop by creating a new FileSystem object? Does this simply mean
> the system can't be read?
> 
> [stack trace snipped]
> 
> -- 
> Jay Vyas
> MMSB/UCHC