Posted to mapreduce-user@hadoop.apache.org by Giri P <gp...@gmail.com> on 2015/07/09 01:19:49 UTC

Re: tools.DistCp: Invalid arguments

Will distcp do a checksum after it has copied the data to the target?

On Tue, Feb 3, 2015 at 4:15 AM, Artem Ervits <ar...@gmail.com> wrote:

> Another good option is hftp.
>
> Artem Ervits
> On Feb 3, 2015 6:42 AM, "xeonmailinglist" <xe...@gmail.com>
> wrote:
>
>>  I have found the problem. I started to use `webhdfs` and everything is
>> ok.
>>
>>
>> On 03-02-2015 10:40, xeonmailinglist wrote:
>>
>>  What do you mean by no path is given? Even if I launch this command, I
>> get the same error… What path should I put here?
>>
>> $ hadoop distcp hdfs://hadoop-coc-1:50070/input1
>> hdfs://hadoop-coc-2:50070/input1
>>
>> Thanks,
>>
>> On 02-02-2015 19:59, Alexander Alten-Lorenz wrote:
>>
>> Have a closer look:
>>
>>    hdfs://hadoop-coc-2:50070/
>>
>>
>>  No Path is given.
>>
>>
>>  On 02 Feb 2015, at 20:52, xeonmailinglist <xe...@gmail.com>
>> wrote:
>>
>>   Hi,
>>
>> I am trying to copy data using distcp, but I get this error. Both Hadoop
>> runtimes are working properly. Why is this happening?
>>
>>
>> vagrant@hadoop-coc-1:~/Programs/hadoop$ hadoop distcp hdfs://hadoop-coc-1:50070/input1 hdfs://hadoop-coc-2:50070/
>> 15/02/02 19:46:37 ERROR tools.DistCp: Invalid arguments:
>> java.io.IOException: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "hadoop-coc-1/127.0.1.1"; destination host is: "hadoop-coc-2":50070;
>>     at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1472)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1399)
>>     at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
>>     at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
>>     at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:752)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>>     at java.lang.reflect.Method.invoke(Method.java:606)
>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
>>     at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
>>     at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
>>     at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1988)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1118)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem$18.doCall(DistributedFileSystem.java:1114)
>>     at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
>>     at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1114)
>>     at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1400)
>>     at org.apache.hadoop.tools.DistCp.setTargetPathExists(DistCp.java:188)
>>     at org.apache.hadoop.tools.DistCp.run(DistCp.java:111)
>>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
>>     at org.apache.hadoop.tools.DistCp.main(DistCp.java:401)
>> Caused by: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.
>>     at com.google.protobuf.InvalidProtocolBufferException.invalidEndTag(InvalidProtocolBufferException.java:94)
>>     at com.google.protobuf.CodedInputStream.checkLastTagWas(CodedInputStream.java:124)
>>     at com.google.protobuf.AbstractParser.parsePartialFrom(AbstractParser.java:202)
>>     at com.google.protobuf.AbstractParser.parsePartialDelimitedFrom(AbstractParser.java:241)
>>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:253)
>>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:259)
>>     at com.google.protobuf.AbstractParser.parseDelimitedFrom(AbstractParser.java:49)
>>     at org.apache.hadoop.ipc.protobuf.RpcHeaderProtos$RpcResponseHeaderProto.parseDelimitedFrom(RpcHeaderProtos.java:3167)
>>     at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1072)
>>     at org.apache.hadoop.ipc.Client$Connection.run(Client.java:966)
>> Invalid arguments: Failed on local exception: com.google.protobuf.InvalidProtocolBufferException: Protocol message end-group tag did not match expected tag.; Host Details : local host is: "hadoop-coc-1/127.0.1.1"; destination host is: "hadoop-coc-2":50070;
>> usage: distcp OPTIONS [source_path...] <target_path>
>>
>> Thanks,
>>
>>
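
For context on the fix above: port 50070 is the NameNode's HTTP port, while an
hdfs:// URI talks to the NameNode's RPC service, so pointing hdfs:// at 50070 is
what produces the protobuf "end-group tag" error in the trace. A sketch of the
two working forms, assuming the clusters use the default RPC port 8020 (check
fs.defaultFS on each cluster; hostnames and paths are the ones from the thread):

# hdfs:// must use the NameNode RPC port, not the web UI port
$ hadoop distcp hdfs://hadoop-coc-1:8020/input1 hdfs://hadoop-coc-2:8020/input1

# webhdfs:// speaks HTTP, so the 50070 web port is correct here; hftp://
# (read-only, also served over the HTTP port) is the older alternative mentioned above
$ hadoop distcp webhdfs://hadoop-coc-1:50070/input1 webhdfs://hadoop-coc-2:50070/input1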

Re: tools.DistCp: Invalid arguments

Posted by Harsh J <ha...@cloudera.com>.
No, you don't need to recheck.

On Saturday, July 11, 2015, Giri P <gp...@gmail.com> wrote:

> So, there is no need to explicitly do a checksum on the source and target
> after we migrate.
>
> I was thinking of using comparechecksum in DistCpUtils to compare after
> migration.
>
> On Fri, Jul 10, 2015 at 4:20 PM, Harsh J <harsh@cloudera.com> wrote:
>
>> Yes, if the length matches and if you haven't specifically asked it to
>> ignore checksums.
>>
>>
>> On Thursday, July 9, 2015, Giri P <gpatcham@gmail.com> wrote:
>>
>>> will distcp do checksum after it copied the data to target ?
>>>
>>>
>>
>> --
>> Harsh J
>>
>
>

-- 
Harsh J
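
For reference, the CRC comparison Harsh describes is DistCp's default: after each
file is copied, its checksum is compared against the source unless that check is
explicitly disabled. A minimal sketch with illustrative paths, assuming the default
8020 RPC port (on some releases -skipcrccheck is only accepted together with -update):

# default behaviour: post-copy checksum comparison is on
$ hadoop distcp hdfs://hadoop-coc-1:8020/input1 hdfs://hadoop-coc-2:8020/input1

# explicitly skip the CRC comparison (generally not recommended)
$ hadoop distcp -update -skipcrccheck hdfs://hadoop-coc-1:8020/input1 hdfs://hadoop-coc-2:8020/input1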

Re: tools.DistCp: Invalid arguments

Posted by Giri P <gp...@gmail.com>.
So, there is no need to explicitly do a checksum on the source and target
after we migrate.

I was thinking of using comparechecksum in DistCpUtils to compare after
migration.

On Fri, Jul 10, 2015 at 4:20 PM, Harsh J <ha...@cloudera.com> wrote:

> Yes, if the length matches and if you haven't specifically asked it to
> ignore checksums.
>
>
> On Thursday, July 9, 2015, Giri P <gp...@gmail.com> wrote:
>
>> will distcp do checksum after it copied the data to target ?
>>
>>
>
> --
> Harsh J
>
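
If an explicit spot check is still wanted after the migration, checksums can also
be compared from the shell instead of calling the DistCp utility classes directly.
A sketch with illustrative file names; the reported checksums are only comparable
when both clusters use the same block size and checksum configuration:

# print the checksum of the same file on source and target and compare them
$ hadoop fs -checksum hdfs://hadoop-coc-1:8020/input1/part-00000
$ hadoop fs -checksum hdfs://hadoop-coc-2:8020/input1/part-00000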

Re: tools.DistCp: Invalid arguments

Posted by Harsh J <ha...@cloudera.com>.
Yes, if the length matches and if you haven't specifically asked it to
ignore checksums.

On Thursday, July 9, 2015, Giri P <gp...@gmail.com> wrote:

> will distcp do checksum after it copied the data to target ?
>
>

-- 
Harsh J
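
One low-effort way to lean on that behaviour after a migration is to re-run the
copy with -update: files whose length and checksum already match are skipped, so a
second run that transfers nothing doubles as a verification pass. A minimal sketch,
again assuming the default 8020 RPC port and the thread's paths:

# unchanged files (same length and checksum) are skipped on the second run
$ hadoop distcp -update hdfs://hadoop-coc-1:8020/input1 hdfs://hadoop-coc-2:8020/input1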

