Posted to user@hbase.apache.org by Venkatesh <vr...@aol.com> on 2011/04/01 18:06:58 UTC

row_counter map reduce job & 0.90.1

 

 I'm able to run this job from the hadoop machine (where the job & task trackers also run)
/hadoop jar /home/maryama/hbase-0.90.1/hbase-0.90.1.jar rowcounter <usertable>

But, I'm not able to run the same job from
a) hbase client machine (full hbase & hadoop installed)
b) hbase server machines (ditto)

Get 
File /home/.../hdfs/tmp/mapred/system/job_201103311630_0024/libjars/hadoop-0.20.2-core.jar does not exist.

Any idea how this jar file gets packaged, and where the job is looking for it?

thanks
v




Re: row_counter map reduce job & 0.90.1

Posted by Ted Yu <yu...@gmail.com>.
I answered the same question under the 'Row Counters' thread.
In short, you should use the following command:

[hadoop@us01-ciqps1-name01 hbase]$ HADOOP_CLASSPATH=`${HBASE_HOME}/bin/hbase classpath` \
    ${HADOOP_HOME}/bin/hadoop jar ${HBASE_HOME}/hbase-0.90.1.jar rowcounter packageindex
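
As an illustrative sanity check (the paths below are made up; on a real machine the string would come from `${HBASE_HOME}/bin/hbase classpath`), listing the classpath one entry per line makes a missing jar easy to spot:

```shell
# Stand-in for the real output of `hbase classpath`; values are illustrative.
CP="/opt/hbase/conf:/opt/hbase/hbase-0.90.1.jar:/opt/hadoop/hadoop-0.20.2-core.jar"

# One entry per line; eyeball (or grep) for the jar the job claims is missing.
echo "$CP" | tr ':' '\n' | grep 'hadoop-0.20.2-core.jar'   # prints /opt/hadoop/hadoop-0.20.2-core.jar
```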


On Fri, Apr 1, 2011 at 9:26 AM, Venkatesh <vr...@aol.com> wrote:

> Definitely yes.. it's all referenced in the -classpath option of the jvm of
> the tasktracker/jobtracker/datanode/namenode..
> & the file does exist on the cluster..
>
> But the error I get is on the client
> File
> /home..../hdfs/tmp/mapred/system/job_201103311630_0027/libjars/hadoop-0.20.2-core.jar
> does not exist.
>    at
> org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
>    at
> org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
>    at
> org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:509)
>    at
> org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:629)
>    at
> org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
>    at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
>
> So, in theory it shouldn't expect it on the client.. correct?
>
> This is the only thing stopping me from moving to 0.90.1

Re: row_counter map reduce job & 0.90.1

Posted by Stack <st...@duboce.net>.
I'm glad you figured it Venkatesh.  St.Ack

On Mon, Apr 4, 2011 at 10:57 AM, Venkatesh <vr...@aol.com> wrote:
> Sorry about this..It was indeed an environment issue..my core-site.xml was pointing to wrong hadoop
> thanks for the tips

Re: row_counter map reduce job & 0.90.1

Posted by Venkatesh <vr...@aol.com>.
Sorry about this.. it was indeed an environment issue: my core-site.xml was pointing at the wrong hadoop.
thanks for the tips
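
For anyone hitting the same error: the fix above amounts to making sure the client's core-site.xml points at the same HDFS the cluster uses. A minimal illustrative fragment (hostname and port are made up, not from this thread):

```xml
<!-- core-site.xml (client side): must name the same HDFS as the cluster -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>
```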

 

 


 

 


Re: row_counter map reduce job & 0.90.1

Posted by Venkatesh <vr...@aol.com>.
 Yeah.. I tried that as well as what Ted suggested.. it still can't find the hadoop jar.
Hadoop map reduce jobs work fine.. it's just hbase map reduce jobs that fail with this error.
tx


Re: row_counter map reduce job & 0.90.1

Posted by Stack <st...@duboce.net>.
Does the machine you are running from have a build/classes dir and a
hadoop-0.20.2-core.jar at the top level?  If so, try cleaning out
build/classes.  Also, you could try something like this:

HADOOP_CLASSPATH=/home/stack/hbase-0.90.2-SNAPSHOT/hbase-0.90.2-SNAPSHOT-tests.jar:/home/stack/hbase-0.90.2-SNAPSHOT/hbase-0.90.2-SNAPSHOT.jar:`/home/stack/hbase-0.90.2-SNAPSHOT/bin/hbase classpath` \
    ./bin/hadoop jar /home/stack/hbase-0.90.2-SNAPSHOT/hbase-0.90.2-SNAPSHOT.jar rowcounter usertable

... only make sure the hadoop jar is in HADOOP_CLASSPATH.
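
The shape of that setup can be sketched as follows (paths illustrative, not from a real install): prepend the HBase jars, then confirm a hadoop core jar is still on the classpath.

```shell
# Illustrative: prepend the HBase jars the way the command above does.
HBASE_JARS="/opt/hbase/hbase-0.90.1-tests.jar:/opt/hbase/hbase-0.90.1.jar"
HADOOP_CLASSPATH="$HBASE_JARS:/opt/hadoop/hadoop-0.20.2-core.jar"

# Count entries matching a hadoop core jar; expect exactly one.
echo "$HADOOP_CLASSPATH" | tr ':' '\n' | grep -c 'hadoop-.*-core.jar'   # prints 1
```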

But you shouldn't have to do the latter at least.  Compare where it
works to where it doesn't.  Something is different.

St.Ack

On Fri, Apr 1, 2011 at 9:26 AM, Venkatesh <vr...@aol.com> wrote:
> Definitely yes.. it's all referenced in the -classpath option of the jvm of the tasktracker/jobtracker/datanode/namenode..
> & file does exist in the cluster..
>
> But the error I get is on the client
> File /home..../hdfs/tmp/mapred/system/job_201103311630_0027/libjars/hadoop-0.20.2-core.jar does not exist.
>    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
>    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
>    at org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:509)
>    at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:629)
>    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
>    at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
>
> So, in theory it shouldn't expect it on the client.. correct?
>
> This is the only thing stopping me from moving to 0.90.1

Re: row_counter map reduce job & 0.90.1

Posted by Venkatesh <vr...@aol.com>.
Definitely yes.. it's all referenced in the -classpath option of the jvm of the tasktracker/jobtracker/datanode/namenode..
& the file does exist on the cluster..

But the error I get is on the client
File /home..../hdfs/tmp/mapred/system/job_201103311630_0027/libjars/hadoop-0.20.2-core.jar does not exist.
    at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:361)
    at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:245)
    at org.apache.hadoop.filecache.DistributedCache.getTimestamp(DistributedCache.java:509)
    at org.apache.hadoop.mapred.JobClient.configureCommandLineOptions(JobClient.java:629)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:761)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)

So, in theory it shouldn't expect it on the client.. correct?

This is the only thing stopping me from moving to 0.90.1

-----Original Message-----
From: Stack <st...@duboce.net>
To: user@hbase.apache.org
Sent: Fri, Apr 1, 2011 12:19 pm
Subject: Re: row_counter map reduce job & 0.90.1


On Fri, Apr 1, 2011 at 9:06 AM, Venkatesh <vr...@aol.com> wrote:
>  I'm able to run this job from the hadoop machine (where job & task tracker
> also runs)
> /hadoop jar /home/maryama/hbase-0.90.1/hbase-0.90.1.jar rowcounter <usertable>
>
> But, I'm not able to run the same job from
> a) hbase client machine (full hbase & hadoop installed)
> b) hbase server machines (ditto)
>
> Get
> File /home/.../hdfs/tmp/mapred/system/job_201103311630_0024/libjars/hadoop-0.20.2-core.jar
> does not exist.
>

Is that jar present on the cluster?

St.Ack


 

Re: row_counter map reduce job & 0.90.1

Posted by Stack <st...@duboce.net>.
On Fri, Apr 1, 2011 at 9:06 AM, Venkatesh <vr...@aol.com> wrote:
>  I'm able to run this job from the hadoop machine (where job & task tracker also runs)
> /hadoop jar /home/maryama/hbase-0.90.1/hbase-0.90.1.jar rowcounter <usertable>
>
> But, I'm not able to run the same job from
> a) hbase client machine (full hbase & hadoop installed)
> b) hbase server machines (ditto)
>
> Get
> File /home/.../hdfs/tmp/mapred/system/job_201103311630_0024/libjars/hadoop-0.20.2-core.jar does not exist.
>

Is that jar present on the cluster?
St.Ack