Posted to common-user@hadoop.apache.org by "manas.tomar" <ma...@zoho.com> on 2010/04/17 11:59:38 UTC

Trouble copying a local file to HDFS

I have set up Hadoop on an OpenSuse 11.2 VM using VirtualBox and ran the Hadoop examples successfully in standalone mode.
Now I want to run in distributed mode with 2 nodes.
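For reference, this is roughly how I bring the cluster up and check it (the standard start scripts shipped with the release; the paths are assumed to match my install):

hadoop@master:~/hadoop> ./bin/start-dfs.sh      # NameNode, DataNodes, SecondaryNameNode
hadoop@master:~/hadoop> ./bin/start-mapred.sh   # JobTracker, TaskTrackers
hadoop@master:~/hadoop> jps                     # lists the running Hadoop daemons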
Hadoop starts fine and jps lists all the daemons. But when I try to put any file into HDFS or run any example, I get an error. For example:

hadoop@master:~/hadoop> ./bin/hadoop dfs -copyFromLocal ./input inputsample
10/04/17 14:42:46 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.SocketException: Operation not supported
10/04/17 14:42:46 INFO hdfs.DFSClient: Abandoning block blk_8951413748418693186_1080                                         
....                                        
10/04/17 14:43:04 INFO hdfs.DFSClient: Exception in createBlockOutputStream java.net.SocketException: Protocol not available 
10/04/17 14:43:04 INFO hdfs.DFSClient: Abandoning block blk_838428157309440632_1081                                          
10/04/17 14:43:10 WARN hdfs.DFSClient: DataStreamer Exception: java.io.IOException: Unable to create new block.            
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2845)                       
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)                                 
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)                        

10/04/17 14:43:10 WARN hdfs.DFSClient: Error Recovery for block blk_838428157309440632_1081 bad datanode[0] nodes == null
10/04/17 14:43:10 WARN hdfs.DFSClient: Could not get block locations. Source file "/user/hadoop/inputsample/check" - Aborting...
copyFromLocal: Protocol not available                                                                                           
10/04/17 14:43:10 ERROR hdfs.DFSClient: Exception closing file /user/hadoop/inputsample/check : java.net.SocketException: Protocol not available                                                                                                                                              
java.net.SocketException: Protocol not available                                                                                               
        at sun.nio.ch.Net.getIntOption0(Native Method)                                                                                         
        at sun.nio.ch.Net.getIntOption(Net.java:178)                                                                                           
        at sun.nio.ch.SocketChannelImpl$1.getInt(SocketChannelImpl.java:419)                                                                   
        at sun.nio.ch.SocketOptsImpl.getInt(SocketOptsImpl.java:60)                                                                            
        at sun.nio.ch.SocketOptsImpl.sendBufferSize(SocketOptsImpl.java:156)                                                                   
        at sun.nio.ch.SocketOptsImpl$IP$TCP.sendBufferSize(SocketOptsImpl.java:286)                                                            
        at sun.nio.ch.OptionAdaptor.getSendBufferSize(OptionAdaptor.java:129)                                                                  
        at sun.nio.ch.SocketAdaptor.getSendBufferSize(SocketAdaptor.java:328)                                                                  
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.createBlockOutputStream(DFSClient.java:2873)                                       
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2826)                                         
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream.access$2000(DFSClient.java:2102)                                                   
        at org.apache.hadoop.hdfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:2288)


I can see the files on HDFS through the web interface, but they are empty.
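Listing the destination directory from the shell (the same inputsample path as in the copy above) shows the same thing, with every entry at size 0:

hadoop@master:~/hadoop> ./bin/hadoop dfs -ls inputsample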
Any suggestions on how I can get past this?