Posted to user@hbase.apache.org by velo0001 <ve...@myid.me> on 2009/08/12 21:36:15 UTC

Re: Exception in rowcount program



Erik Holstad wrote:
> 
>>  you followed the instructions on
> http://wiki.apache.org/hadoop/Hbase/MapReduce
> and it didn't work for you?
> 

I followed exactly the instructions given above, but I am getting the same
type of error for my map reduce job: a class not found.

(jar -tfv shows me that the ImmutableBytesWritable class does exist in the
0.20 hadoop jar - I am using the 0.20 version)
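[Editor's note: a quick way to double-check which jar really carries a class is to grep the archive listing; a jar is just a zip, so unzip works even without a JDK on the box. This is only an illustrative sketch - the jar path is the one quoted in this thread, and the helper name is made up:]

```shell
# contains_class JAR FQCN: report whether a jar's listing mentions the
# class. A jar is a zip archive, so 'unzip -l' can list it without a JDK.
contains_class() {
  jar=$1
  # Convert org.foo.Bar -> org/foo/Bar.class, the entry name inside the jar.
  class_path=$(echo "$2" | tr '.' '/').class
  if [ ! -f "$jar" ]; then
    echo "no such jar: $jar"
  elif unzip -l "$jar" | grep -qF "$class_path"; then
    echo "found $2 in $jar"
  else
    echo "$2 not in $jar"
  fi
}

# Path from this thread; run the same check against the hadoop jar too.
contains_class /usr/hbase_install/hbase-0.20.0/hbase-0.20.0.jar \
    org.apache.hadoop.hbase.io.ImmutableBytesWritable
```

As later replies note, the class should turn up in the HBase jar, not the Hadoop one.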

Here is my export in hadoop-env.sh:

export HADOOP_CLASSPATH=/usr/hbase_install/hbase-0.20.0/hbase-0.20.0.jar:/usr/hbase_install/hbase-0.20.0/hbase-0.20.0-test.jar:/usr/hbase_install/hbase-0.20.0/hbase/conf

(I could try copying the hadoop 0.20 jar and the hbase-site.xml into my
hadoop conf dir, but I really don’t want to use that approach)
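[Editor's note: when an export like the one above silently fails to take effect, one cheap sanity check is that every entry in it actually exists on disk - for instance, the final entry above ends in "hbase-0.20.0/hbase/conf", which is worth verifying. A minimal sketch, with a made-up helper name; point it at your real HADOOP_CLASSPATH value:]

```shell
# check_classpath CP: split a Java-style classpath on ':' and report
# whether each entry exists on disk. Helps spot typos in long exports.
check_classpath() {
  old_ifs=$IFS
  IFS=:
  for entry in $1; do
    if [ -e "$entry" ]; then
      echo "OK $entry"
    else
      echo "MISSING $entry"
    fi
  done
  IFS=$old_ifs
}

# Illustrative call; substitute the real export value.
check_classpath "/usr/hbase_install/hbase-0.20.0/hbase-0.20.0.jar:/usr/hbase_install/hbase-0.20.0/hbase/conf"
```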

Any ideas why I continue to get this error:


DEBUG 14:53:49.536 main org.apache.hadoop.mapred.JobClient(776) - Creating splits at hdfs://devdkvstore.headquarters.socketware.com/tmp/hadoop-root/mapred/system/job_200908121217_0003/job.split
 INFO 14:53:49.552 main org.apache.hadoop.mapred.FileInputFormat(192) - Total input paths to process : 1
DEBUG 14:53:49.770 main org.apache.hadoop.mapred.FileInputFormat(248) - Total # of splits: 2
 INFO 14:53:50.923 main org.apache.hadoop.mapred.JobClient(1278) - Running job: job_200908121217_0003
 INFO 14:53:51.936 main org.apache.hadoop.mapred.JobClient(1291) -  map 0% reduce 0%
 INFO 14:54:06.802 main org.apache.hadoop.mapred.JobClient(1320) - Task Id : attempt_200908121217_0003_m_000000_0, Status : FAILED
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.io.ImmutableBytesWritable
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:840)
	at org.apache.hadoop.mapred.JobConf.getMapOutputKeyClass(JobConf.java:590)
	at org.apache.hadoop.mapred.JobConf.getOutputKeyComparator(JobConf.java:664)
	at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.<init>(MapTask.java:689)
	at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:348)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:305)
	at org.apache.hadoop.mapred.Child.main(Child.java:170)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.io.ImmutableBytesWritable
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:808)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:832)
	... 6 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.io.ImmutableBytesWritable
	at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
	at java.security.AccessController.doPrivileged(Native Method)
	at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
	at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
	at java.lang.Class.forName0(Native Method)
	at java.lang.Class.forName(Class.java:247)
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:761)
	at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:806)
	... 7 more



-- 
View this message in context: http://www.nabble.com/Exception-in-rowcount-program-tp23195921p24943168.html
Sent from the HBase User mailing list archive at Nabble.com.


Re: Exception in rowcount program

Posted by Erik Holstad <er...@gmail.com>.
Hey!
If you are dealing with HBase + Hadoop 0.20 you should have a look at:
http://hadoop.apache.org/hbase/docs/current/api/org/apache/hadoop/hbase/mapred/package-summary.html#package_description
ImmutableBytesWritable is part of HBase, not Hadoop, so maybe you missed
something when linking HBase into your MR job.

Regards Erik

Re: Exception in rowcount program

Posted by Schubert Zhang <zs...@gmail.com>.
This is my setting:

hadoop-env.sh:

export HADOOP_CLASSPATH=${HADOOP_HOME}/../hbase-0.20.0/hbase-0.20.0.jar:${HADOOP_HOME}/../hbase-0.20.0/conf:${HADOOP_HOME}/../hbase-0.20.0/lib/zookeeper-r785019-hbase-1329.jar

hbase-env.sh:

export HBASE_CLASSPATH=${HBASE_HOME}/../hadoop-0.20.0/conf





Re: Exception in rowcount program

Posted by velo0001 <ve...@myid.me>.


stack-3 wrote:
> 
> 
>>>    Running a MR
>>>job, participating classes must be available on the CLASSPATH on all
nodes,
>>>not just client.
>>>
>>>My guess is that running from eclipse you were not fulfilling the latter
>>>requirement.
>>>
>>>Did you try running the rowcounter program from the command-line?
>>>
>>>St.Ack
> 
> 

Hi St.Ack,

The purpose of adding the HBase jar to the Hadoop classpath export was to
make the HBase classes known and usable to Hadoop. That, of course, is on
the Linux machine where both Hadoop and HBase are running, and it was meant
to fulfill the HBase class availability you mention above. It did not seem
to work. But physically copying the HBase jar over to the Hadoop deployment
DID work. Again, this was on the Linux machine, not on the Windows client
machine. Running a client process on a remote Windows machine did not
generate the error in any way, or it would not have been fixed by physically
copying the jar to Hadoop on the server machine. And that did fix it. So the
real question I see is: why did the change to the HADOOP_CLASSPATH export,
on the Hadoop deployment, NOT work as it was supposed to? That is maybe the
thing that needs to be looked at.

No, I did not try running the rowcounter code. 

Thanks for the response, and I hope this makes the issue clearer. 

(So, right now I am applying the 660 shim patches to PIG 0.3.0 - hoping to
get that running tomorrow. )

_mc



-- 
View this message in context: http://www.nabble.com/Exception-in-rowcount-program-tp23195921p24957698.html
Sent from the HBase User mailing list archive at Nabble.com.


Re: Exception in rowcount program

Posted by stack <st...@duboce.net>.
On Thu, Aug 13, 2009 at 3:59 AM, velo0001 <ve...@myid.me> wrote:

>
> I am running it in Eclipse on a windows machine client, going to a Linux
> machine where the Hadoop 0.20.0 DFS is deployed and the HBase 0.20.0 is
> deployed. I have used this WIN client machine for inserting row columns (up
> to a million rows with 3 columns each so far in my testing) and reading
> back
> existing row/columns, so I am relatively sure that all my conf setups on
> the
> windows client and the Linux server are correct. Everything else seems to
> run OK for me, except map reduce.
>

For inserting rows, the client just needs to be configured properly. For
running an MR job, the participating classes must be available on the
CLASSPATH on all nodes, not just on the client.

My guess is that, running from Eclipse, you were not fulfilling the latter
requirement.

Did you try running the rowcounter program from the command-line?

St.Ack



Re: Exception in rowcount program

Posted by velo0001 <ve...@myid.me>.
Copying the hbase-0.20.0.jar and the hbase-site.xml files into the Hadoop
deployment location DID solve the above class-not-found issue.

Then it turned out that I also needed HBase's ZooKeeper jar file to be
available to Hadoop as well.

This time, modifying the HADOOP_CLASSPATH in the *env.sh DID allow the
running Hadoop to resolve the ZooKeeper classes, as it did NOT with the
HBase classes. To repeat: I did NOT need to copy the ZooKeeper jar file into
the Hadoop location, as I did have to with the HBase jar.
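[Editor's note: for reference, the working combination described above - HBase jar, ZooKeeper jar, and HBase conf dir made visible to Hadoop - can be sketched as one hadoop-env.sh helper. The install path is the one from this thread and the jar-name glob patterns are assumptions; adjust both to the actual install:]

```shell
# build_hbase_classpath HBASE_HOME: emit a colon-separated classpath with
# the HBase jar, the bundled ZooKeeper jar, and the HBase conf directory.
# The glob patterns below are assumptions about the jar file names.
build_hbase_classpath() {
  home=$1
  cp=""
  for jar in "$home"/hbase-*.jar "$home"/lib/zookeeper-*.jar; do
    [ -f "$jar" ] || continue            # skip unmatched glob patterns
    cp="${cp:+$cp:}$jar"
  done
  echo "${cp:+$cp:}$home/conf"
}

# In hadoop-env.sh this would be used roughly as (path from this thread):
export HADOOP_CLASSPATH="$(build_hbase_classpath /usr/hbase_install/hbase-0.20.0)"
```

Using globs instead of hard-coded jar names means a version bump does not silently break the export.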

So, FWIW, the above is my experience, and it seems to mirror the original
poster's experience.

The MapReduce job runs fine now:

DEBUG 10:02:52.449 org.apache.hadoop.mapred.JobClient (JobClient.java:776) - Creating splits at hdfs://devdkvstore.headquarters.socketware.com/tmp/hadoop-root/mapred/system/job_200908130817_0001/job.split
 INFO 10:02:52.465 org.apache.hadoop.mapred.FileInputFormat (FileInputFormat.java:192) - Total input paths to process : 1
DEBUG 10:02:52.480 org.apache.hadoop.mapred.FileInputFormat (FileInputFormat.java:248) - Total # of splits: 2
 INFO 10:02:53.073 org.apache.hadoop.mapred.JobClient (JobClient.java:1278) - Running job: job_200908130817_0001
 INFO 10:02:54.088 org.apache.hadoop.mapred.JobClient (JobClient.java:1291) -  map 0% reduce 0%
 INFO 10:03:16.459 org.apache.hadoop.mapred.JobClient (JobClient.java:1291) -  map 100% reduce 0%
 INFO 10:03:24.779 org.apache.hadoop.mapred.JobClient (JobClient.java:1291) -  map 100% reduce 100%
 INFO 10:03:26.793 org.apache.hadoop.mapred.JobClient (JobClient.java:1346) - Job complete: job_200908130817_0001
DEBUG 10:03:26.809 org.apache.hadoop.mapred.Counters (Counters.java:151) - Creating group org.apache.hadoop.mapred.JobInProgress$Counter with bundle
DEBUG 10:03:26.809 org.apache.hadoop.mapred.Counters (Counters.java:151) - Creating group FileSystemCounters with nothing
DEBUG 10:03:26.809 org.apache.hadoop.mapred.Counters (Counters.java:151) - Creating group org.apache.hadoop.mapred.Task$Counter with bundle
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:514) - Counters: 17
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:516) -   Job Counters
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Launched reduce tasks=1
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Launched map tasks=2
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Data-local map tasks=2
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:516) -   FileSystemCounters
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     FILE_BYTES_READ=4387
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     HDFS_BYTES_READ=5073
 INFO 10:03:26.809 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     FILE_BYTES_WRITTEN=8844
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:516) -   Map-Reduce Framework
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Reduce input groups=100
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Combine output records=0
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Map input records=100
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Reduce shuffle bytes=4393
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Reduce output records=100
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Spilled Records=200
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Map output bytes=4181
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Map input bytes=3381
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Combine input records=0
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Map output records=100
 INFO 10:03:26.824 org.apache.hadoop.mapred.JobClient (Counters.java:518) -     Reduce input records=100


Now, onward to learning about and using PIG....

_mc

-- 
View this message in context: http://www.nabble.com/Exception-in-rowcount-program-tp23195921p24954786.html
Sent from the HBase User mailing list archive at Nabble.com.


Re: Exception in rowcount program

Posted by velo0001 <ve...@myid.me>.
I am running it in Eclipse on a Windows client machine, going against a
Linux machine where the Hadoop 0.20.0 DFS is deployed and HBase 0.20.0 is
deployed. I have used this Windows client machine for inserting row columns
(up to a million rows with 3 columns each so far in my testing) and reading
back existing rows/columns, so I am relatively sure that all my conf setups
on the Windows client and the Linux server are correct. Everything else
seems to run OK for me, except map reduce.

Maybe map reduce processes are not designed to be run this way?

And for the reply above yours, that package info is exactly what I used when
setting things up. Look at my HADOOP_CLASSPATH export - it is what is shown
in that information. And I have double-checked the pathing, and it is all
correct for the Linux server that Hadoop and HBase are on.

I am about to start playing with Pig, but I need to get map reduce working
too.

I will probably try copying the hbase-site.xml and the hbase jar file into
the hadoop location, but like I said, I really don't want that to be the
answer.

Thanks for any help....

_mc



stack-3 wrote:
> 
> How are you running it?
> 
> $ ./bin/hadoop jar hbase.jar...
> 
> .. or some other way?
> 
> St.Ack
> 

-- 
View this message in context: http://www.nabble.com/Exception-in-rowcount-program-tp23195921p24952886.html
Sent from the HBase User mailing list archive at Nabble.com.


Re: Exception in rowcount program

Posted by stack <st...@duboce.net>.
How are you running it?

$ ./bin/hadoop jar hbase.jar...

.. or some other way?

St.Ack
