Posted to common-user@hadoop.apache.org by Joseph Wang <jo...@yahoo.com> on 2006/03/21 03:05:59 UTC

cannot create file when doing dfs -put command

I want to set up two servers, 68.142.240.166 and
68.142.240.168, to run the example program. I put
the following into hadoop-site.xml and ran
bin/start-all.sh on both nodes. However,
when I try to run
   "bin/hadoop dfs -put input input"
to copy a file into DFS, I get an error
stating it cannot create the file. I looked
under /tmp/hadoop/dfs and there is no
/tmp/hadoop/dfs/data directory. The directory is
readable and writable by anyone. Any suggestions?

<configuration>
  <property>
    <name>fs.default.name</name>
    <value>68.142.240.166:9000</value>
  </property>
  <property>
    <name>mapred.job.tracker</name>
    <value>68.142.240.166:9001</value>
  </property>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/tmp/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/tmp/hadoop/dfs/data</value>
  </property>
  <property>
    <name>mapred.local.dir</name>
    <value>/home/dmhadoop/hadoop-nightly/tmp/hadoop/mapred/local</value>
    <description>The local directory where MapReduce stores intermediate
    data files.  May be a space- or comma-separated list of
    directories on different devices in order to spread disk i/o.
    </description>
  </property>
  <property>
    <name>mapred.map.tasks</name>
    <value>20</value>
  </property>
  <property>
    <name>mapred.reduce.tasks</name>
    <value>4</value>
  </property>

</configuration>
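
For reference, here is a minimal sketch of how a client ends up seeing these
properties: a Configuration object layers hadoop-site.xml on top of
hadoop-default.xml, which is what the "parsing ..." lines in the output below
report. This is just my own illustration using the standard Configuration API;
it is not part of the setup above.

import org.apache.hadoop.conf.Configuration;

public class ShowConf {
    public static void main(String[] args) {
        // Loads hadoop-default.xml first, then overrides it with hadoop-site.xml.
        Configuration conf = new Configuration();
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        System.out.println("dfs.data.dir    = " + conf.get("dfs.data.dir"));
        System.out.println("dfs.replication = " + conf.getInt("dfs.replication", 3));
    }
}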
                     



060320 174758 parsing file:/home/dmhadoop/hadoop-nightly/conf/hadoop-default.xml
060320 174758 parsing file:/home/dmhadoop/hadoop-nightly/conf/hadoop-site.xml
060320 174758 No FS indicated, using default:webden304.ysm.den.yahoo.com:9000
060320 174758 Client connection to 68.142.240.166:9000: starting
060320 174758 parsing file:/home/dmhadoop/hadoop-nightly/conf/hadoop-default.xml
060320 174758 parsing file:/home/dmhadoop/hadoop-nightly/conf/hadoop-site.xml
Exception in thread "main" java.io.IOException: Cannot create file /user/root/input/test.txt on client DFSClient_1905685913
        at org.apache.hadoop.ipc.Client.call(Client.java:301)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:141)
        at org.apache.hadoop.dfs.$Proxy0.create(Unknown Source)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:566)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.<init>(DFSClient.java:547)
        at org.apache.hadoop.dfs.DFSClient.create(DFSClient.java:98)
        at org.apache.hadoop.dfs.DistributedFileSystem.createRaw(DistributedFileSystem.java:71)
        at org.apache.hadoop.fs.FSDataOutputStream$Summer.<init>(FSDataOutputStream.java:39)
        at org.apache.hadoop.fs.FSDataOutputStream.<init>(FSDataOutputStream.java:128)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:180)
        at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:168)
        at org.apache.hadoop.dfs.DistributedFileSystem.doFromLocalFile(DistributedFileSystem.java:156)
        at org.apache.hadoop.dfs.DistributedFileSystem.doFromLocalFile(DistributedFileSystem.java:150)
        at org.apache.hadoop.dfs.DistributedFileSystem.copyFromLocalFile(DistributedFileSystem.java:131)
        at org.apache.hadoop.dfs.DFSShell.copyFromLocal(DFSShell.java:42)
        at org.apache.hadoop.dfs.DFSShell.main(DFSShell.java:250)
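
For what it's worth, the trace is the copyFromLocal path in DFSShell. A rough
Java equivalent of "bin/hadoop dfs -put input input" would look like the
sketch below. This is only an illustration using the generic FileSystem API
from later Hadoop releases (Path-based methods); the 0.1-era DFSShell and
DFSClient classes in the trace take plain file paths, so the exact signatures
may differ.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class PutInput {
    public static void main(String[] args) throws Exception {
        // Picks up fs.default.name = 68.142.240.166:9000 from hadoop-site.xml.
        Configuration conf = new Configuration();
        // Returns a DistributedFileSystem client connected to the namenode.
        FileSystem fs = FileSystem.get(conf);
        // Same effect as "bin/hadoop dfs -put input input": copy the local
        // "input" directory into DFS under the current user's home directory.
        fs.copyFromLocalFile(new Path("input"), new Path("input"));
        fs.close();
    }
}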


