Posted to common-user@hadoop.apache.org by Karthik Kumar <ka...@gmail.com> on 2011/01/26 09:58:24 UTC

Cannot copy files to HDFS

Hi,

I am new to Hadoop. I am using Hadoop 0.20.2. I tried to copy a file of
size 300 MB from local to HDFS, and it failed with the error below. Please
help me in solving this issue.
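
The copy was done with the copyFromLocal shell command, along these lines
(the local path shown here is an assumption; the HDFS target path is taken
from the log below):

    $ bin/hadoop fs -copyFromLocal cdr10M.csv /hdfs/data/input/cdr10M.csv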

11/01/26 13:01:52 WARN hdfs.DFSClient: DataStreamer Exception:
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.write0(Native Method)
        at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:33)
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:104)
        at sun.nio.ch.IOUtil.write(IOUtil.java:75)
        at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:334)
        at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
        at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
        at java.io.DataOutputStream.write(DataOutputStream.java:90)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2314)

11/01/26 13:01:52 WARN hdfs.DFSClient: Error Recovery for block blk_4184614741505116937_1012 bad datanode[0] 160.110.184.114:50010
11/01/26 13:01:52 WARN hdfs.DFSClient: Error Recovery for block blk_4184614741505116937_1012 in pipeline 160.110.184.114:50010, 160.110.184.111:50010: bad datanode 160.110.184.114:50010
11/01/26 13:01:55 WARN hdfs.DFSClient: Error Recovery for block blk_4184614741505116937_1012 failed because recovery from primary datanode 160.110.184.111:50010 failed 1 times. Pipeline was 160.110.184.114:50010, 160.110.184.111:50010. Will retry...
11/01/26 13:01:55 WARN hdfs.DFSClient: Error Recovery for block blk_4184614741505116937_1012 bad datanode[0] 160.110.184.114:50010
11/01/26 13:01:55 WARN hdfs.DFSClient: Error Recovery for block blk_4184614741505116937_1012 in pipeline 160.110.184.114:50010, 160.110.184.111:50010: bad datanode 160.110.184.114:50010
11/01/26 13:02:28 WARN hdfs.DFSClient: DataStreamer Exception:
java.io.IOException: An existing connection was forcibly closed by the remote host
        at sun.nio.ch.SocketDispatcher.write0(Native Method)
        at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:33)
        at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:104)
        at sun.nio.ch.IOUtil.write(IOUtil.java:75)
        at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:334)
        at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:55)
        at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:142)
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:146)
        at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:107)
        at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
        at java.io.DataOutputStream.write(DataOutputStream.java:90)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2314)

11/01/26 13:02:28 WARN hdfs.DFSClient: Error Recovery for block blk_4184614741505116937_1013 bad datanode[0] 160.110.184.111:50010
copyFromLocal: All datanodes 160.110.184.111:50010 are bad. Aborting...
11/01/26 13:02:28 ERROR hdfs.DFSClient: Exception closing file /hdfs/data/input/cdr10M.csv : java.io.IOException: All datanodes 160.110.184.111:50010 are bad. Aborting...
java.io.IOException: All datanodes 160.110.184.111:50010 are bad. Aborting...
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.processDatanodeError(DFSClient.java:2556)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$1600(DFSClient.java:2102)
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2265)

-- 
With Regards,
Karthik

Re: Cannot copy files to HDFS

Posted by rahul patodi <pa...@gmail.com>.
Hi,
Your DataNode is not up.
Please run the jps command to check that all the required daemons are running.
You can refer to http://www.hadoop-tutorial.blogspot.com/
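
On a small cluster you would expect jps output along these lines (a
sketch; the process IDs will differ, and on a multi-node setup the
daemons are spread across machines):

    $ jps
    2881 NameNode
    2990 DataNode
    3121 SecondaryNameNode
    3209 JobTracker
    3312 TaskTracker

If DataNode is missing from the list, look at its log under
$HADOOP_HOME/logs (hadoop-<user>-datanode-<hostname>.log) to see why it
did not start.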


-- 
*Regards*,
Rahul Patodi
Software Engineer,
Impetus Infotech (India) Pvt Ltd,
www.impetus.com
Mob:09907074413

Re: Cannot copy files to HDFS

Posted by li ping <li...@gmail.com>.
Please double-check that the node is alive and that you have permission to
connect to it.
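
You can check which DataNodes the NameNode considers alive with the
dfsadmin report (a sketch; the counts shown are illustrative):

    $ bin/hadoop dfsadmin -report
    ...
    Datanodes available: 2
    ...

Since the exception says the connection was forcibly closed by the remote
host, it is also worth ruling out a firewall between the client and the
DataNode port, for example:

    $ telnet 160.110.184.114 50010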

On Wed, Jan 26, 2011 at 4:58 PM, Karthik Kumar <ka...@gmail.com> wrote:

> Hi,
>
> I am new to Hadoop. I am using Hadoop 0.20.2. I tried to copy a file of
> size 300 MB from local to HDFS, and it failed with the error below.
> Please help me in solving this issue.
>
> [error log and stack traces snipped]



-- 
-----李平