Posted to common-user@hadoop.apache.org by christopher pax <ch...@gmail.com> on 2008/03/13 05:13:53 UTC

hadoop dfs -ls command not working

i run something like this:
$: bin/hadoop dfs -ls /home/cloud/wordcount/input/
and get this:
ls: Could not get listing for /home/cloud/wordcount/input


the directory input does exist at that path.

there are 2 files in that directory, file01 and file02, both of which contain text.

what i am doing is running the word count example from
http://hadoop.apache.org/core/docs/r0.16.0/mapred_tutorial.html
the program compiles fine.

the dfs commands from the example are not working.
this is not working for me either:
$: bin/hadoop jar /home/cloud/wordcount.jar org.myorg.WordCount
/home/cloud/wordcount/input /home/cloud/wordcount/output

hope you guys can help,
thanks

Re: hadoop dfs -ls command not working

Posted by Amar Kamat <am...@yahoo-inc.com>.
Hi, the jar file should be on the local fs. The input/output paths should be
on Hadoop's filesystem (the dfs). So the command should look like
HADOOP_HOME/bin/hadoop jar jar-on-local-fs main-class input-path-on-dfs
output-path-on-dfs.
In distributed mode the jar file gets shipped from the client machine to
the nodes that run the job.
Amar
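Concretely, the sequence Amar describes might look like this (a sketch only; HADOOP_HOME is assumed to be the current directory, and the dfs paths are illustrative — these commands need a running 0.16-era Hadoop installation):

```shell
# Copy the local input files into Hadoop's filesystem (dfs)
bin/hadoop dfs -mkdir /wordcount/input
bin/hadoop dfs -put /home/cloud/wordcount/input/file01 /wordcount/input/
bin/hadoop dfs -put /home/cloud/wordcount/input/file02 /wordcount/input/

# Verify the files are visible in the dfs (not the local filesystem)
bin/hadoop dfs -ls /wordcount/input

# The jar stays on the local filesystem; input/output are dfs paths
bin/hadoop jar /home/cloud/wordcount.jar org.myorg.WordCount \
    /wordcount/input /wordcount/output
```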
On Thu, 13 Mar 2008, christopher pax wrote:

> thank you very much it worked.
> now that I have the files there I tried to run the jar file, and get this:
> $ ../hadoop-0.16.0/bin/hadoop jar /wordcount.jar org.myorg.WordCount
> /wordcount/input /wordcount/output
> Exception in thread "main" java.io.IOException: Error opening job jar:
> /wordcount.jar
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
> Caused by: java.util.zip.ZipException: error in opening zip file
>         at java.util.zip.ZipFile.open(Native Method)
>         at java.util.zip.ZipFile.<init>(ZipFile.java:114)
>         at java.util.jar.JarFile.<init>(JarFile.java:133)
>         at java.util.jar.JarFile.<init>(JarFile.java:70)
>         at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
>
> is there a downloadable jar file for the word count program from the tutorial?
> I am sure that I'm compiling it fine; I followed the tutorial step by
> step, and only get an error when I try to run it.
>
> On Thu, Mar 13, 2008 at 12:46 AM, Amar Kamat <am...@yahoo-inc.com> wrote:
> > Assuming that you are using Hadoop in distributed mode.
> >
> > On Thu, 13 Mar 2008, christopher pax wrote:
> >
> >  > i run something like this:
> >  > $: bin/hadoop dfs -ls /home/cloud/wordcount/input/
> >  This path should exist in the dfs (i.e. Hadoop's filesystem) and not on the
> >  local filesystem. Looking at the jar file (see below) I assume that you
> >  are giving it a local filesystem path. Put the files into the dfs
> >  using 'bin/hadoop dfs -put' and then provide the dfs paths as the
> >  source and the target. In 'standalone' mode a local path would work.
> >  Amar
> >
> >
> > > and get this:
> >  > ls: Could not get listing for /home/cloud/wordcount/input
> >  >
> >  >
> >  > the directory input does exist at that path.
> >  >
> >  > there are 2 files in that directory, file01 and file02, both of which contain text.
> >  >
> >  > what i am doing is running the word count example from
> >  > http://hadoop.apache.org/core/docs/r0.16.0/mapred_tutorial.html
> >  > the program compiles fine.
> >  >
> >  > the dfs commands from the example are not working.
> >  > this is not working for me either:
> >  > $: bin/hadoop jar /home/cloud/wordcount.jar org.myorg.WordCount
> >                     ^^^^^ (this jar path must be on the local filesystem)
> >  > /home/cloud/wordcount/input /home/cloud/wordcount/output
> >  >
> >  > hope you guys can help,
> >  > thanks
> >  >
> >
>

Re: hadoop dfs -ls command not working

Posted by christopher pax <ch...@gmail.com>.
thank you very much it worked.
now that I have the files there I tried to run the jar file, and get this:
$ ../hadoop-0.16.0/bin/hadoop jar /wordcount.jar org.myorg.WordCount
/wordcount/input /wordcount/output
Exception in thread "main" java.io.IOException: Error opening job jar:
/wordcount.jar
        at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
        at java.util.zip.ZipFile.open(Native Method)
        at java.util.zip.ZipFile.<init>(ZipFile.java:114)
        at java.util.jar.JarFile.<init>(JarFile.java:133)
        at java.util.jar.JarFile.<init>(JarFile.java:70)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:88)

is there a downloadable jar file for the word count program from the tutorial?
I am sure that I'm compiling it fine; I followed the tutorial step by
step, and only get an error when I try to run it.
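The ZipException here usually means the path given to `hadoop jar` does not point at a readable jar on the local filesystem (in this run the path is `/wordcount.jar`, i.e. the root directory). A quick check, plus the tutorial's build steps, might look like this (a sketch; the core jar name `hadoop-0.16.0-core.jar` and the source layout are assumptions that vary with your install):

```shell
# Does the jar exist, and is it a valid archive?
ls -l /wordcount.jar
jar tf /wordcount.jar        # should list org/myorg/WordCount.class

# If not, rebuild it as in the tutorial
mkdir -p wordcount_classes
javac -classpath hadoop-0.16.0/hadoop-0.16.0-core.jar \
    -d wordcount_classes WordCount.java
jar cvf wordcount.jar -C wordcount_classes/ .
```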

On Thu, Mar 13, 2008 at 12:46 AM, Amar Kamat <am...@yahoo-inc.com> wrote:
> Assuming that you are using Hadoop in distributed mode.
>
> On Thu, 13 Mar 2008, christopher pax wrote:
>
>  > i run something like this:
>  > $: bin/hadoop dfs -ls /home/cloud/wordcount/input/
>  This path should exist in the dfs (i.e. Hadoop's filesystem) and not on the
>  local filesystem. Looking at the jar file (see below) I assume that you
>  are giving it a local filesystem path. Put the files into the dfs
>  using 'bin/hadoop dfs -put' and then provide the dfs paths as the
>  source and the target. In 'standalone' mode a local path would work.
>  Amar
>
>
> > and get this:
>  > ls: Could not get listing for /home/cloud/wordcount/input
>  >
>  >
>  > the directory input does exist at that path.
>  >
>  > there are 2 files in that directory, file01 and file02, both of which contain text.
>  >
>  > what i am doing is running the word count example from
>  > http://hadoop.apache.org/core/docs/r0.16.0/mapred_tutorial.html
>  > the program compiles fine.
>  >
>  > the dfs commands from the example are not working.
>  > this is not working for me either:
>  > $: bin/hadoop jar /home/cloud/wordcount.jar org.myorg.WordCount
>                     ^^^^^ (this jar path must be on the local filesystem)
>  > /home/cloud/wordcount/input /home/cloud/wordcount/output
>  >
>  > hope you guys can help,
>  > thanks
>  >
>

Re: hadoop dfs -ls command not working

Posted by Amar Kamat <am...@yahoo-inc.com>.
Assuming that you are using Hadoop in distributed mode.
On Thu, 13 Mar 2008, christopher pax wrote:

> i run something like this:
> $: bin/hadoop dfs -ls /home/cloud/wordcount/input/
This path should exist in the dfs (i.e. Hadoop's filesystem) and not on the
local filesystem. Looking at the jar file (see below) I assume that you
are giving it a local filesystem path. Put the files into the dfs
using 'bin/hadoop dfs -put' and then provide the dfs paths as the
source and the target. In 'standalone' mode a local path would work.
Amar
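For example, the -put step might look like this (a sketch; the source path is the one from the question, and the dfs target path is illustrative):

```shell
# Copy the local input directory into the dfs recursively
bin/hadoop dfs -put /home/cloud/wordcount/input /wordcount/input

# Now -ls queries the dfs path, not the local one
bin/hadoop dfs -ls /wordcount/input
```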
> and get this:
> ls: Could not get listing for /home/cloud/wordcount/input
>
>
> the directory input does exist at that path.
>
> there are 2 files in that directory, file01 and file02, both of which contain text.
>
> what i am doing is running the word count example from
> http://hadoop.apache.org/core/docs/r0.16.0/mapred_tutorial.html
> the program compiles fine.
>
> the dfs commands from the example are not working.
> this is not working for me either:
> $: bin/hadoop jar /home/cloud/wordcount.jar org.myorg.WordCount
                    ^^^^^ (this jar path must be on the local filesystem)
> /home/cloud/wordcount/input /home/cloud/wordcount/output
>
> hope you guys can help,
> thanks
>