Posted to common-user@hadoop.apache.org by Cagdas Gerede <ca...@gmail.com> on 2008/03/13 18:33:32 UTC
HadoopDfsReadWriteExample
I tried the HadoopDfsReadWriteExample and am getting the following error. I
would appreciate any help; more details are at the end.
Error while copying file
Exception in thread "main" java.io.IOException: Cannot run program
"df": CreateProcess error=2, The system cannot find the file specified
at java.lang.ProcessBuilder.start(ProcessBuilder.java:459)
at java.lang.Runtime.exec(Runtime.java:593)
at java.lang.Runtime.exec(Runtime.java:466)
at org.apache.hadoop.fs.ShellCommand.runCommand(ShellCommand.java:48)
at org.apache.hadoop.fs.ShellCommand.run(ShellCommand.java:42)
at org.apache.hadoop.fs.DF.getAvailable(DF.java:72)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:296)
at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.createTmpFileForWrite(LocalDirAllocator.java:326)
at org.apache.hadoop.fs.LocalDirAllocator.createTmpFileForWrite(LocalDirAllocator.java:155)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.newBackupFile(DFSClient.java:1483)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.openBackupStream(DFSClient.java:1450)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.writeChunk(DFSClient.java:1592)
at org.apache.hadoop.fs.FSOutputSummer.writeChecksumChunk(FSOutputSummer.java:140)
at org.apache.hadoop.fs.FSOutputSummer.flushBuffer(FSOutputSummer.java:122)
at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.close(DFSClient.java:1728)
at org.apache.hadoop.fs.FSDataOutputStream$PositionCache.close(FSDataOutputStream.java:49)
at org.apache.hadoop.fs.FSDataOutputStream.close(FSDataOutputStream.java:64)
at HadoopDFSFileReadWrite.main(HadoopDFSFileReadWrite.java:106)
Caused by: java.io.IOException: CreateProcess error=2, The system
cannot find the file specified
at java.lang.ProcessImpl.create(Native Method)
at java.lang.ProcessImpl.<init>(ProcessImpl.java:81)
at java.lang.ProcessImpl.start(ProcessImpl.java:30)
at java.lang.ProcessBuilder.start(ProcessBuilder.java:452)
... 17 more
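[Editor's note: the first application frame in the trace, org.apache.hadoop.fs.DF.getAvailable, shells out to the Unix `df` program; when that binary cannot be found, ProcessBuilder.start throws exactly this IOException (on Windows it surfaces as "CreateProcess error=2"). A minimal stdlib-only sketch of that failure mode, using a deliberately nonexistent command name in place of `df`:]

```java
import java.io.IOException;

public class MissingCommandDemo {
    /**
     * Returns true if launching the given command fails with IOException,
     * i.e. the executable cannot be found on the PATH. This is the same
     * mechanism by which Hadoop's DF class fails when "df" is missing.
     */
    static boolean commandMissing(String cmd) {
        try {
            new ProcessBuilder(cmd).start();
            return false;
        } catch (IOException e) {
            // On stock Windows, "df" fails here the same way:
            // Cannot run program "df": CreateProcess error=2
            return true;
        }
    }

    public static void main(String[] args) {
        // A made-up name reproduces the failure on any platform.
        System.out.println(commandMissing("no-such-command-xyz")
                ? "command not found on PATH"
                : "command started");
    }
}
```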
Note: I am on a Windows machine, and the namenode is running on the same
machine. I initialized the configuration as follows:
Configuration conf = new Configuration();
conf.addResource(new Path("C:\\cygwin\\hadoop-management\\hadoop-conf\\hadoop-site.xml"));
FileSystem fs = FileSystem.get(conf);
Any suggestions?
Cagdas
Re: HadoopDfsReadWriteExample
Posted by Cagdas Gerede <ca...@gmail.com>.
For anyone who runs into a similar problem:
I noticed that the org.apache.hadoop.fs.DF class documentation says:
"Filesystem disk space usage statistics. Uses the unix 'df' program.
Tested on Linux, FreeBSD, Cygwin."
So I ran the java command for my HDFS-accessing application from within
Cygwin, and it worked fine.
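[Editor's note: running under Cygwin works because Cygwin's bin directory is on the PATH, so `df` becomes resolvable. A hedged sketch of how such resolution can be checked up front; the helper name onPath is illustrative, not a Hadoop API, and on Windows the name typically also needs an .exe suffix:]

```java
import java.io.File;

public class PathCheck {
    /**
     * Walks the PATH environment variable and reports whether an
     * executable with the given name (e.g. "df") is resolvable,
     * mirroring what the OS does when a process is launched.
     */
    static boolean onPath(String name) {
        String path = System.getenv("PATH");
        if (path == null) return false;
        for (String dir : path.split(File.pathSeparator)) {
            File candidate = new File(dir, name);
            if (candidate.isFile() && candidate.canExecute()) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // Under Cygwin or Linux this prints true; on stock Windows, false.
        System.out.println("df resolvable: " + onPath("df"));
    }
}
```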
Cagdas