Posted to user@ambari.apache.org by zhangwei <zh...@richinfo.cn> on 2013/08/16 03:51:25 UTC

hdp2.0.5 hdfs programming

Hi, everybody,

       When I used the YARN HDFS client (2.0.5), I got this exception:
failed :java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status; Host Details : local host is: "MICROSOF-8CECED/192.168.50.196"; destination host is: "hdp247.localhost":8020; 
 at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:761)
 at org.apache.hadoop.ipc.Client.call(Client.java:1239)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202)
 at com.sun.proxy.$Proxy9.mkdirs(Unknown Source)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
 at java.lang.reflect.Method.invoke(Unknown Source)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164)
 at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83)
 at com.sun.proxy.$Proxy9.mkdirs(Unknown Source)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:446)
 at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2142)
 at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2113)
 at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:540)
 at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1881)
 at com.richinfo.hdfs.HDFSUtil.mkdirs(HDFSUtil.java:79)
 at com.richinfo.hdfs.HdfsDemo.main(HdfsDemo.java:111)
Caused by: com.google.protobuf.InvalidProtocolBufferException: Message missing required fields: callId, status
 at com.google.protobuf.UninitializedMessageException.asInvalidProtocolBufferException(UninitializedMessageException.java:81)
 at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.buildParsed(RpcPayloadHeaderProtos.java:1094)
 at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto$Builder.access$1300(RpcPayloadHeaderProtos.java:1028)
 at org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcPayloadHeaderProtos.java:986)
 at org.apache.hadoop.ipc.Client$Connection.receiveResponse(Client.java:946)
 at org.apache.hadoop.ipc.Client$Connection.run(Client.java:84

I set config.set("fs.defaultFS", "hdfs://hdp247.localhost:8020");
Can anybody suggest a solution?
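
For reference, here is a minimal sketch of what my demo driver does (the class name and target path below are illustrative, not the exact HdfsDemo source):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsMkdirsSketch {
    public static void main(String[] args) throws Exception {
        Configuration config = new Configuration();
        // Same setting as above; 8020 is the NameNode RPC port.
        config.set("fs.defaultFS", "hdfs://hdp247.localhost:8020");
        FileSystem fs = FileSystem.get(config);
        // The mkdirs call is where the InvalidProtocolBufferException
        // surfaces; "/tmp/demo" is an illustrative path.
        boolean created = fs.mkdirs(new Path("/tmp/demo"));
        System.out.println("mkdirs returned: " + created);
        fs.close();
    }
}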




-----------------------------------------
Best regards!  

Re: hdp2.0.5 hdfs programming

Posted by zhangwei <zh...@richinfo.cn>.
Hi, Hitesh,

           My cluster runs HDP 2.0.4. I compiled against the 2.0.4 client libraries as well and still get the same exception.
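
To check which Hadoop jar my client actually loads, I used something like this (a sketch; the class name comes from the stack trace above):

import org.apache.hadoop.ipc.protobuf.RpcPayloadHeaderProtos;

public class ClasspathCheck {
    public static void main(String[] args) {
        // Print the jar that supplies the RPC header protos seen in the
        // stack trace, to confirm which client version is on the classpath.
        System.out.println(RpcPayloadHeaderProtos.class
                .getProtectionDomain().getCodeSource().getLocation());
    }
}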
-----------------------------------------
Best regards!   

----- Original Message ----- 
From: "Hitesh Shah" <hi...@apache.org>
To: <am...@incubator.apache.org>
Sent: Friday, August 16, 2013 12:32 PM
Subject: Re: hdp2.0.5 hdfs programming


Hi 

Can you confirm that your demo driver is compiled against the same source code as what is running on your cluster? 

-- Hitesh

On Aug 15, 2013, at 6:51 PM, zhangwei wrote:

> [original message and full stack trace snipped; quoted verbatim above]


Re: hdp2.0.5 hdfs programming

Posted by Hitesh Shah <hi...@apache.org>.
Hi 

Can you confirm that your demo driver is compiled against the same source code as what is running on your cluster? 
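
One quick way to compare (a sketch): run something like the class below with your application classpath and compare its output with what "hadoop version" prints on a cluster node.

import org.apache.hadoop.util.VersionInfo;

public class ClientVersionCheck {
    public static void main(String[] args) {
        // Version reported by the Hadoop jars on the client classpath.
        System.out.println("Version: " + VersionInfo.getVersion());
        System.out.println("Built from revision: " + VersionInfo.getRevision());
    }
}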

-- Hitesh

On Aug 15, 2013, at 6:51 PM, zhangwei wrote:

> [original message and full stack trace snipped; quoted verbatim above]