Posted to common-user@hadoop.apache.org by Sandy <sn...@gmail.com> on 2008/06/23 21:40:20 UTC

trouble setting up hadoop

I apologize for how basic this question is, but I am in the process
of setting Hadoop up. I have been following the instructions in the
Hadoop quickstart, and I have confirmed that bin/hadoop prints its help/usage
information.

I am now at the standalone-operation stage.

I typed in:
mkdir input
cp conf/*.xml input
bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z.]+'

at which point I get:
Exception in thread "main" java.lang.ClassNotFoundException:
java.lang.Iterable not found in
gnu.gcj.runtime.SystemClassLoader{urls=[
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../conf/,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../hadoop-0.16.4-core.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-cli-2.0-SNAPSHOT.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-codec-1.3.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-httpclient-3.0.1.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-logging-1.0.4.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/commons-logging-api-1.0.4.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jets3t-0.5.0.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-5.1.4.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/junit-3.8.1.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/kfs-0.1.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/log4j-1.2.13.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/servlet-api.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/xmlenc-0.52.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/commons-el.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/jasper-compiler.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/jasper-runtime.jar,
  file:/home/sjm/Desktop/hadoop-0.16.4/bin/../lib/jetty-ext/jsp-api.jar],
parent=gnu.gcj.runtime.ExtensionClassLoader{urls=[], parent=null}}
   at java.net.URLClassLoader.findClass (libgcj.so.7)
   at java.lang.ClassLoader.loadClass (libgcj.so.7)
   at java.lang.ClassLoader.loadClass (libgcj.so.7)
   at java.lang.VMClassLoader.defineClass (libgcj.so.7)
   at java.lang.ClassLoader.defineClass (libgcj.so.7)
   at java.security.SecureClassLoader.defineClass (libgcj.so.7)
   at java.net.URLClassLoader.findClass (libgcj.so.7)
   at java.lang.ClassLoader.loadClass (libgcj.so.7)
   at java.lang.ClassLoader.loadClass (libgcj.so.7)
   at org.apache.hadoop.util.RunJar.main (RunJar.java:107)

I suspect the issue is path related, though I am not certain. Could someone
please point me in the right direction?

Much thanks,

SM
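The stack trace above already points at the cause: java.lang.Iterable was added in Java 5, and libgcj (the GNU gcj runtime) does not provide it, so bin/hadoop is resolving gcj's java rather than a Sun JVM. A quick way to check which runtime is first on the PATH (standard commands; the printed version line will vary by system):

```shell
# Print the first line of the resolved Java's version banner.
# A mention of "gcj" here means Hadoop will run on the GNU runtime and fail.
if command -v java >/dev/null 2>&1; then
    java -version 2>&1 | head -n 1
else
    echo "no java on PATH"
fi
```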

Re: trouble setting up hadoop

Posted by Sandy <sn...@gmail.com>.
Hi Stefan,

I think that did it. When I type in
java -version
I now get:
java version "1.6.0_06"
Java(TM) SE Runtime Environment (build 1.6.0_06-b02)
Java HotSpot(TM) Client VM (build 10.0-b22, mixed mode, sharing)

And, when I run:
bin/hadoop jar hadoop-*-examples.jar grep input output 'dfs[a-z]+'

I get:
08/06/23 17:03:12 INFO jvm.JvmMetrics: Initializing JVM Metrics with
processName=JobTracker, sessionId=
08/06/23 17:03:13 INFO mapred.FileInputFormat: Total input paths to process : 2
08/06/23 17:03:13 INFO mapred.JobClient: Running job: job_local_1
08/06/23 17:03:13 INFO mapred.MapTask: numReduceTasks: 1
08/06/23 17:03:13 INFO mapred.LocalJobRunner:
file:/home/sjm/Desktop/hadoop-0.16.4/input/hadoop-site.xml:0+178
08/06/23 17:03:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0000' done.
08/06/23 17:03:13 INFO mapred.TaskRunner: Saved output of task
'job_local_1_map_0000' to
file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821
08/06/23 17:03:13 INFO mapred.MapTask: numReduceTasks: 1
08/06/23 17:03:13 INFO mapred.LocalJobRunner:
file:/home/sjm/Desktop/hadoop-0.16.4/input/hadoop-default.xml:0+34064
08/06/23 17:03:13 INFO mapred.TaskRunner: Task 'job_local_1_map_0001' done.
08/06/23 17:03:13 INFO mapred.TaskRunner: Saved output of task
'job_local_1_map_0001' to
file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821
08/06/23 17:03:13 INFO mapred.LocalJobRunner: reduce > reduce
08/06/23 17:03:13 INFO mapred.TaskRunner: Task 'reduce_ov0kiq' done.
08/06/23 17:03:13 INFO mapred.TaskRunner: Saved output of task
'reduce_ov0kiq' to file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821
08/06/23 17:03:14 INFO mapred.JobClient: Job complete: job_local_1
08/06/23 17:03:14 INFO mapred.JobClient: Counters: 9
08/06/23 17:03:14 INFO mapred.JobClient:   Map-Reduce Framework
08/06/23 17:03:14 INFO mapred.JobClient:     Map input records=1125
08/06/23 17:03:14 INFO mapred.JobClient:     Map output records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Map input bytes=34242
08/06/23 17:03:14 INFO mapred.JobClient:     Map output bytes=0
08/06/23 17:03:14 INFO mapred.JobClient:     Combine input records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Combine output records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Reduce input groups=0
08/06/23 17:03:14 INFO mapred.JobClient:     Reduce input records=0
08/06/23 17:03:14 INFO mapred.JobClient:     Reduce output records=0
08/06/23 17:03:14 INFO jvm.JvmMetrics: Cannot initialize JVM Metrics with
processName=JobTracker, sessionId= - already initialized
08/06/23 17:03:14 INFO mapred.FileInputFormat: Total input paths to process : 1
08/06/23 17:03:14 INFO mapred.JobClient: Running job: job_local_2
08/06/23 17:03:14 INFO mapred.MapTask: numReduceTasks: 1
08/06/23 17:03:14 INFO mapred.LocalJobRunner:
file:/home/sjm/Desktop/hadoop-0.16.4/grep-temp-1561747821/part-00000:0+86
08/06/23 17:03:14 INFO mapred.TaskRunner: Task 'job_local_2_map_0000' done.
08/06/23 17:03:14 INFO mapred.TaskRunner: Saved output of task
'job_local_2_map_0000' to file:/home/sjm/Desktop/hadoop-0.16.4/output
08/06/23 17:03:14 INFO mapred.LocalJobRunner: reduce > reduce
08/06/23 17:03:14 INFO mapred.TaskRunner: Task 'reduce_448bva' done.
08/06/23 17:03:14 INFO mapred.TaskRunner: Saved output of task
'reduce_448bva' to file:/home/sjm/Desktop/hadoop-0.16.4/output
08/06/23 17:03:15 INFO mapred.JobClient: Job complete: job_local_2
08/06/23 17:03:15 INFO mapred.JobClient: Counters: 9
08/06/23 17:03:15 INFO mapred.JobClient:   Map-Reduce Framework
08/06/23 17:03:15 INFO mapred.JobClient:     Map input records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Map output records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Map input bytes=0
08/06/23 17:03:15 INFO mapred.JobClient:     Map output bytes=0
08/06/23 17:03:15 INFO mapred.JobClient:     Combine input records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Combine output records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Reduce input groups=0
08/06/23 17:03:15 INFO mapred.JobClient:     Reduce input records=0
08/06/23 17:03:15 INFO mapred.JobClient:     Reduce output records=0

Does this all look correct? If so, thank you so much. I really appreciate
all the help!

-SM
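A note on the zero counters in the run above: Map output records=0 is likely because the pattern used was 'dfs[a-z]+' rather than the quickstart's 'dfs[a-z.]+'. Hadoop property names such as dfs.replication have a literal dot immediately after "dfs", so the dot-less character class can never match them. A standalone illustration with plain grep (sample.xml is a made-up file for demonstration):

```shell
# Property names in the conf files look like <name>dfs.replication</name>.
# 'dfs[a-z]+' requires a letter immediately after "dfs", so the dot blocks
# the match; 'dfs[a-z.]+' accepts the dot and matches the full name.
echo '<name>dfs.replication</name>' > sample.xml
grep -Eo 'dfs[a-z]+' sample.xml    # no output: nothing matches
grep -Eo 'dfs[a-z.]+' sample.xml   # → dfs.replication
```

So the job ran correctly; rerunning it with the quickstart's original pattern should produce non-zero match counts.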

On Mon, Jun 23, 2008 at 4:32 PM, Stefan Groschupf <sg...@101tec.com> wrote:

> Looks like you have not installed a correct Java.
> Make sure you have Sun Java installed on your nodes, that java is on your
> path, and that JAVA_HOME is set.
> I think gnu.gcj is the GNU Java compiler, not a Java runtime that can run
> Hadoop.
> Check this on the command line:
> $ java -version
> You should see something like this:
> java version "1.5.0_13"
> Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_13-b05-237)
> Java HotSpot(TM) Client VM (build 1.5.0_13-119, mixed mode, sharing)
>
> HTH
>
>
>
> On Jun 23, 2008, at 9:40 PM, Sandy wrote:
>
>> [original question quoted in full; see the first message in this thread]
>>
>
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> 101tec Inc.
> Menlo Park, California, USA
> http://www.101tec.com
>
>
>

Re: trouble setting up hadoop

Posted by Stefan Groschupf <sg...@101tec.com>.
Looks like you have not installed a correct Java.
Make sure you have Sun Java installed on your nodes, that java is on
your path, and that JAVA_HOME is set.
I think gnu.gcj is the GNU Java compiler, not a Java runtime that can
run Hadoop.
Check this on the command line:
$ java -version
You should see something like this:
java version "1.5.0_13"
Java(TM) 2 Runtime Environment, Standard Edition (build 1.5.0_13-b05-237)
Java HotSpot(TM) Client VM (build 1.5.0_13-119, mixed mode, sharing)
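If java -version still reports gcj after installing a Sun JDK, pointing the environment at the new install usually fixes it. A minimal sketch; the JDK path below is an assumption, so substitute the actual install location:

```shell
# Assumed install location; adjust to wherever the Sun JDK actually lives.
export JAVA_HOME=/usr/java/jdk1.6.0_06
# Put the JDK's bin directory ahead of gcj's on the PATH.
export PATH="$JAVA_HOME/bin:$PATH"
```

Hadoop also reads JAVA_HOME from conf/hadoop-env.sh, so setting it there makes the choice persistent for the daemons.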

HTH


On Jun 23, 2008, at 9:40 PM, Sandy wrote:

> [original question quoted in full; see the first message in this thread]

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
101tec Inc.
Menlo Park, California, USA
http://www.101tec.com