Posted to user@hadoop.apache.org by akshay naidu <ak...@gmail.com> on 2019/05/23 12:26:52 UTC

Hadoop distcp failing.

Hello Users,
I'm trying to copy data from a Hadoop cluster in London to another
cluster in Singapore. This is my first time using DistCp; for testing
I've set up a Hadoop cluster in each data center.
I'm running the following command:

> hadoop distcp
> hftp://123.45.672:54310/data-analytics/strike/myLogs/today/815_19_104_150_2019-05-22.access.log.gz
> hdfs://987.65.43.21:50070/distCp/
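
If I've read the docs right, hftp:// talks to the source NameNode's HTTP port (dfs.http.address, default 50070), while hdfs:// expects the RPC port (fs.defaultFS, commonly 8020 or 54310), so the ports in my URIs may be the wrong way around. Something like the following is what I suspect the command should look like (untested; hosts are placeholders and the ports are assumed defaults):

```shell
# Untested sketch: hftp:// against the source NameNode's HTTP port,
# hdfs:// against the target NameNode's RPC port (both assumed defaults).
hadoop distcp \
  hftp://<source-namenode>:50070/data-analytics/strike/myLogs/today/815_19_104_150_2019-05-22.access.log.gz \
  hdfs://<target-namenode>:8020/distCp/
```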


I'm getting the following error:
19/05/23 12:07:27 INFO tools.OptionsParser: parseChunkSize: blocksperchunk false
19/05/23 12:07:48 INFO ipc.Client: Retrying connect to server: li868-219.members.linode.com/987.65.43.21:8020. Already tried 0 time(s); maxRetries=45
.
.
19/05/23 12:22:29 INFO ipc.Client: Retrying connect to server: li868-219.members.linode.com/139.162.27.219:8020. Already tried 44 time(s); maxRetries=45
19/05/23 12:22:49 ERROR tools.DistCp: Invalid arguments:
org.apache.hadoop.net.ConnectTimeoutException: Call From HDP-master/192.168.203.188 to li868-219.members.linode.com:8020 failed on socket timeout exception: org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=li868-219.members.linode.com/139.162.27.219:8020]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
        at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
        at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:824)
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:774)
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1497)
        at org.apache.hadoop.ipc.Client.call(Client.java:1439)
        at org.apache.hadoop.ipc.Client.call(Client.java:1349)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:227)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
        at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:796)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
        at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
        at com.sun.proxy.$Proxy11.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1717)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1526)
        at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1523)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1523)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1627)
        at org.apache.hadoop.tools.DistCp.setTargetPathExists(DistCp.java:234)
        at org.apache.hadoop.tools.DistCp.run(DistCp.java:138)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
        at org.apache.hadoop.tools.DistCp.main(DistCp.java:519)
Caused by: org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=li868-219.members.linode.com/139.162.27.219:8020]
        at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:534)
        at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:687)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:790)
        at org.apache.hadoop.ipc.Client$Connection.access$3500(Client.java:411)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1554)
        at org.apache.hadoop.ipc.Client.call(Client.java:1385)
        ... 25 more
Invalid arguments: Call From HDP-master/192.168.203.188 to li868-219.members.linode.com:8020 failed on socket timeout exception: org.apache.hadoop.net.ConnectTimeoutException: 20000 millis timeout while waiting for channel to be ready for connect. ch : java.nio.channels.SocketChannel[connection-pending remote=li868-219.members.linode.com/139.162.27.219:8020]; For more details see: http://wiki.apache.org/hadoop/SocketTimeout
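
Since the client retried for 15 minutes without ever getting an answer on port 8020, the port may simply not be reachable from this machine (firewall, or the NameNode bound to a local interface). For what it's worth, here is a quick Hadoop-independent reachability check I can run from the client, using bash's /dev/tcp (a sketch; the host and port are the ones from the error above):

```shell
#!/usr/bin/env bash
# Sketch: test whether a TCP port answers at all, independent of Hadoop.
# check_port <host> <port> -> exit 0 if a connection succeeds within 5s.
check_port() {
  # /dev/tcp is a bash feature; `timeout` caps DNS lookup + connect time.
  timeout 5 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Host/port taken from the error message above:
if check_port li868-219.members.linode.com 8020; then
  echo "8020 reachable"
else
  echo "8020 unreachable - check firewalls and the NameNode bind address"
fi
```

If this prints "unreachable", the problem is network-level, not a DistCp argument problem.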


Any guidance or hints would be very helpful. Thanks.