Posted to hdfs-user@hadoop.apache.org by Ken Been <Ke...@twosigma.com> on 2013/12/18 19:45:07 UTC

compatibility between new client and old server

I am trying to make a 2.2.0 Java client work with a 1.1.2 server.  The error I am currently getting is below.  I'd like to know if my problem is because I have configured something wrong or because the versions are simply not compatible for what I want to do.  Thanks in advance for any help.

Ken

        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
        at org.apache.hadoop.ipc.Client.call(Client.java:1351)
        at org.apache.hadoop.ipc.Client.call(Client.java:1300)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
        at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1106)
        at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
        at my code...
Caused by: java.io.EOFException
        at java.io.DataInputStream.readInt(DataInputStream.java:392)
        at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:995)
        at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)
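The "Caused by: java.io.EOFException" at DataInputStream.readInt is the telling part of the trace: the connection was closed before the client could read a response length, which is what happens when the server does not speak the client's wire protocol. A minimal, Hadoop-free sketch of that failure mode (the class name and message are illustrative, not from Hadoop):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.EOFException;
import java.io.IOException;

public class EofDemo {
    public static void main(String[] args) throws IOException {
        // Simulate a server that dropped the connection without sending a
        // length-prefixed response: the client sees an empty stream.
        DataInputStream in =
                new DataInputStream(new ByteArrayInputStream(new byte[0]));
        try {
            // Hadoop's Client$Connection.receiveRpcResponse starts by reading
            // an int; on a closed/empty stream readInt throws EOFException.
            int len = in.readInt();
            System.out.println("response length: " + len);
        } catch (EOFException e) {
            System.out.println("EOFException: server closed connection before replying");
        }
    }
}
```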

Re: compatibility between new client and old server

Posted by Suresh Srinivas <su...@hortonworks.com>.
2.x is a new major release; 1.x and 2.x are not wire-compatible. In 1.x the RPC
wire protocol used Hadoop's own Writable-based serialization. In 2.x the RPC
wire protocol uses Protocol Buffers (protobuf). A client must be compiled
against 2.x, and must use the matching 2.x jars, to work with a 2.x cluster.
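In practice that means pinning the client build to the same release line as the cluster. A hedged Maven sketch (coordinates assumed from Maven Central; for 1.x releases the classic artifact is hadoop-core):

```xml
<!-- To talk to the 1.1.2 cluster, build against 1.x jars: -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.1.2</version>
</dependency>

<!-- To talk to a 2.x cluster, swap in the 2.x client artifact instead:
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.2.0</version>
</dependency>
-->
```

Mixing the two (a 2.2.0 client against a 1.1.2 NameNode, as above) fails at the RPC layer regardless of configuration.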


On Wed, Dec 18, 2013 at 10:45 AM, Ken Been <Ke...@twosigma.com> wrote:

> I am trying to make a 2.2.0 Java client work with a 1.1.2 server. The
> error I am currently getting is below. I'd like to know if my problem is
> because I have configured something wrong or because the versions are
> simply not compatible for what I want to do. Thanks in advance for any
> help.
>
> Ken
>
>         at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1351)
>         at org.apache.hadoop.ipc.Client.call(Client.java:1300)
>         at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
>         at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:601)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
>         at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>         at com.sun.proxy.$Proxy18.getFileInfo(Unknown Source)
>         at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:651)
>         at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1679)
>         at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1106)
>         at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1102)
>         at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>         at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1102)
>         at my code...
> Caused by: java.io.EOFException
>         at java.io.DataInputStream.readInt(DataInputStream.java:392)
>         at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:995)
>         at org.apache.hadoop.ipc.Client$Connection.run(Client.java:891)



-- 
http://hortonworks.com/download/

-- 
CONFIDENTIALITY NOTICE
NOTICE: This message is intended for the use of the individual or entity to 
which it is addressed and may contain information that is confidential, 
privileged and exempt from disclosure under applicable law. If the reader 
of this message is not the intended recipient, you are hereby notified that 
any printing, copying, dissemination, distribution, disclosure or 
forwarding of this communication is strictly prohibited. If you have 
received this communication in error, please contact the sender immediately 
and delete it from your system. Thank You.
