Posted to user@hbase.apache.org by praveenesh kumar <pr...@gmail.com> on 2011/06/06 10:48:49 UTC

Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Hi,

I was not able to see my email in the mail archive, so I am sending it again.
Guys, I need your feedback!

Thanks,
Praveenesh
---------- Forwarded message ----------
From: praveenesh kumar <pr...@gmail.com>
Date: Mon, Jun 6, 2011 at 12:09 PM
Subject: Hadoop is not working after adding
hadoop-core-0.20-append-r1056497.jar
To: common-user@hadoop.apache.org, user@hbase.apache.org


Hello guys!

I am currently working on HBase 0.90.3 and Hadoop 0.20.2.

Since this Hadoop version does not support a durable sync in HDFS,
I copied the hadoop-core-append jar file from the hbase/lib folder
into the hadoop folder and used it to replace hadoop-0.20.2-core.jar,
which was suggested in the following link:

http://www.apacheserver.net/Using-Hadoop-bundled-in-lib-directory-HBase-at1136240.htm

I believe this is what the link describes. If I
am doing something wrong, kindly tell me.
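
In shell terms, what I did was roughly this (the hadoop path is from my
setup; treat both paths as illustrative):

    # back up the stock jar, then drop in the append jar from HBase's lib/
    cd /usr/local/hadoop/hadoop
    mv hadoop-0.20.2-core.jar hadoop-0.20.2-core.jar.orig
    cp /path/to/hbase/lib/hadoop-core-0.20-append-r1056497.jar .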

But now, after adding that jar file, I am not able to run my Hadoop. I am
getting the following exception messages on my screen:

ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/util/PlatformName
ub13: Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.util.PlatformName
ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
ub13:   at java.security.AccessController.doPrivileged(Native Method)
ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
Program will exit.
ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/hdfs/server/datanode/DataNode
ub13: starting secondarynamenode, logging to
/usr/local/hadoop/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-ub13.out
ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/util/PlatformName
ub13: Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.util.PlatformName
ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
ub13:   at java.security.AccessController.doPrivileged(Native Method)
ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
Program will exit.
ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
org/apache/hadoop/hdfs/server/namenode/SecondaryNameNode
Have I done something wrong? Please guide me!

Thanks,
Praveenesh

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by praveenesh kumar <pr...@gmail.com>.
It worked after renaming the hadoop-append*.jar file to hadoop-0.20.2-core.jar.
I don't know why, but it worked!
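For the record, the rename was roughly this (the jar name is the one
shipped in HBase 0.90.3's lib/; the path is from my setup):

    cd /usr/local/hadoop/hadoop
    mv hadoop-core-0.20-append-r1056497.jar hadoop-0.20.2-core.jar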
Also, after this my HBase started fine once, but after that it is not
working; there is some problem in starting the region servers.
I have sent the exceptions in my other email. I hope it will reach the
mailing list after some time.

Thanks,
Praveenesh

On Mon, Jun 6, 2011 at 8:59 PM, Stack <st...@duboce.net> wrote:

> On Mon, Jun 6, 2011 at 6:23 AM, praveenesh kumar <pr...@gmail.com>
> wrote:
> > Changing the name of the hadoop-append-core.jar file to
> > hadoop-0.20.2-core.jar did the trick..
> > Its working now..
> > But is this the right solution to this problem ??
> >
>
> It would seem to be.  Did you have two hadoop*jar versions in your lib
> directory by any chance?  You did not remove the first?
> St.Ack
>

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Mon, Jun 6, 2011 at 6:23 AM, praveenesh kumar <pr...@gmail.com> wrote:
> Changing the name of the hadoop-append-core.jar file to
> hadoop-0.20.2-core.jar did the trick..
> Its working now..
> But is this the right solution to this problem ??
>

It would seem to be.  Did you have two hadoop*jar versions in your lib
directory by any chance?  You did not remove the first?
St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Mike Spreitzer <ms...@us.ibm.com>.
http://hbase.apache.org/notsoquick.html

is still out there, and its content is older than the book.  This is a 
problem in itself, and needs to be fixed.

Thanks,
Mike Spreitzer


saint.ack@gmail.com wrote on 06/09/2011 07:40:31 PM:

> From: Stack <st...@duboce.net>
> To: user@hbase.apache.org
> Date: 06/09/2011 07:42 PM
> Subject: Re: Hadoop not working after replacing hadoop-core.jar with
> hadoop-core-append.jar
> Sent by: saint.ack@gmail.com
> 
> On Tue, Jun 7, 2011 at 2:32 PM, Stack <st...@duboce.net> wrote:
> > On Mon, Jun 6, 2011 at 10:37 PM, Mike Spreitzer <ms...@us.ibm.com> wrote:
> >> So my
> >> suggestion is to be unequivocal about it: when running distributed, always
> >> build your own Hadoop and put its -core JAR into your HBase installation
> >> (or use Cloudera, which has done this for you).  Also: explicitly explain
> >> how the file has to be named (there is a strict naming requirement so that
> >> the launching scripts work, right?).
> >>
> 
> I made this change,
> http://svn.apache.org/viewvc?view=revision&revision=1134129 .  I
> removed the section where we talk of copying the hbase hadoop jar
> across a cluster.  I notice that in Michael Noll's blog he talks of
> renaming branch-0.20.append jar as Andy Zhong points out so will leave
> it at that (unless others have improvements).
> 
> St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Michel Segel <mi...@hotmail.com>.
Just a simple suggestion...
When you install Hadoop and HBase, you may want to go into /usr/lib/hadoop and create a symbolic link called hadoop.jar that points to the current hadoop jar. Do the same for hbase.
Then make all of your references to these sym links and environment variables as needed.
Makes life a lot easier.
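
For example, something like this (version strings are illustrative; use
whatever you actually have installed):

    cd /usr/lib/hadoop
    ln -s hadoop-core-0.20.2-append.jar hadoop.jar   # scripts/env vars point at this name
    # and likewise for hbase, e.g. in /usr/lib/hbase:
    ln -s hbase-0.90.3.jar hbase.jar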


Sent from a remote device. Please excuse any typos...

Mike Segel

On Jun 9, 2011, at 6:40 PM, Stack <st...@duboce.net> wrote:

> On Tue, Jun 7, 2011 at 2:32 PM, Stack <st...@duboce.net> wrote:
>> On Mon, Jun 6, 2011 at 10:37 PM, Mike Spreitzer <ms...@us.ibm.com> wrote:
>>> So my
>>> suggestion is to be unequivocal about it: when running distributed, always
>>> build your own Hadoop and put its -core JAR into your HBase installation
>>> (or use Cloudera, which has done this for you).  Also: explicitly explain
>>> how the file has to be named (there is a strict naming requirement so that
>>> the launching scripts work, right?).
>>> 
> 
> I made this change,
> http://svn.apache.org/viewvc?view=revision&revision=1134129 .  I
> removed the section where we talk of copying the hbase hadoop jar
> across a cluster.  I notice that in Michael Noll's blog he talks of
> renaming branch-0.20.append jar as Andy Zhong points out so will leave
> it at that (unless others have improvements).
> 
> St.Ack
> 

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Tue, Jun 7, 2011 at 2:32 PM, Stack <st...@duboce.net> wrote:
> On Mon, Jun 6, 2011 at 10:37 PM, Mike Spreitzer <ms...@us.ibm.com> wrote:
>> So my
>> suggestion is to be unequivocal about it: when running distributed, always
>> build your own Hadoop and put its -core JAR into your HBase installation
>> (or use Cloudera, which has done this for you).  Also: explicitly explain
>> how the file has to be named (there is a strict naming requirement so that
>> the launching scripts work, right?).
>>

I made this change,
http://svn.apache.org/viewvc?view=revision&revision=1134129 .  I
removed the section where we talk of copying the hbase hadoop jar
across a cluster.  I notice that in Michael Noll's blog he talks of
renaming branch-0.20.append jar as Andy Zhong points out so will leave
it at that (unless others have improvements).

St.Ack

RE: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by "Zhong, Andy" <Sh...@searshc.com>.
Yes, the name does matter. It should work after renaming
hadoop-core-append.jar to hadoop-core.jar. It may help to check
http://www.michael-noll.com/blog/2011/04/14/building-an-hadoop-0-20-x-version-for-hbase-0-90-2/
 

-----Original Message-----
From: saint.ack@gmail.com [mailto:saint.ack@gmail.com] On Behalf Of
Stack
Sent: Thursday, June 09, 2011 12:22 PM
To: user@hbase.apache.org
Subject: Re: Hadoop not working after replacing hadoop-core.jar with
hadoop-core-append.jar

On Tue, Jun 7, 2011 at 2:32 PM, Stack <st...@duboce.net> wrote:
> On Mon, Jun 6, 2011 at 10:37 PM, Mike Spreitzer <ms...@us.ibm.com>
wrote:
>> Also: explicitly explain
>> how the file has to be named (there is a strict naming requirement so
>> that the launching scripts work, right?).
>>

...

> Also, I second what Andrew says: I do not know of any place where
> the name of the jar is inscribed, so how the jar is named should play
> no role at all.
>

Maybe the name does matter.  Do you think you ran into the issue that
Hari figured at the end of this thread:
http://search-hadoop.com/m/JxAKc2fztAb2/Confusion+regarding+version+of+hadoop+to+use+in+hbase+0.90.1&subj=Confusion+regarding+version+of+hadoop+to+use+in+hbase+0+90+1

St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Andrew Purtell <ap...@apache.org>.
> From: Stack <st...@duboce.net>
> > Also, I second what Andrew says: I do not know of
> > any place where the name of the jar is inscribed, so how
> > the jar is named should play no role at all.
> >
> 
> Maybe the name does matter.  Do you think you ran into
> the issue that Hari figured at the end of this thread:
> http://search-hadoop.com/m/JxAKc2fztAb2/Confusion+regarding+version+of+hadoop+to+use+in+hbase+0.90.1&subj=Confusion+regarding+version+of+hadoop+to+use+in+hbase+0+90+1
> 

I haven't seen that.

But then to be sure, make sure the Hadoop jars are named with the correct version string, i.e. if version is "0.20.2-append", then you should have hadoop-core-0.20.2-append.jar in lib/.
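
A quick way to sanity-check this (HBASE_HOME here is just my shorthand
for wherever HBase is installed):

    ls $HADOOP_HOME/hadoop-core-*.jar $HBASE_HOME/lib/hadoop-core-*.jar
    # version strings -- and ideally checksums -- should line up
    md5sum $HADOOP_HOME/hadoop-core-*.jar $HBASE_HOME/lib/hadoop-core-*.jar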

   - Andy


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Tue, Jun 7, 2011 at 2:32 PM, Stack <st...@duboce.net> wrote:
> On Mon, Jun 6, 2011 at 10:37 PM, Mike Spreitzer <ms...@us.ibm.com> wrote:
>> Also: explicitly explain
>> how the file has to be named (there is a strict naming requirement so that
>> the launching scripts work, right?).
>>

...

> Also, I second what Andrew says: I do not know of any place where
> the name of the jar is inscribed, so how the jar is named should play
> no role at all.
>

Maybe the name does matter.  Do you think you ran into the issue that
Hari figured at the end of this thread:
http://search-hadoop.com/m/JxAKc2fztAb2/Confusion+regarding+version+of+hadoop+to+use+in+hbase+0.90.1&subj=Confusion+regarding+version+of+hadoop+to+use+in+hbase+0+90+1

St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Mon, Jun 6, 2011 at 10:37 PM, Mike Spreitzer <ms...@us.ibm.com> wrote:
> I have been looking at
>
> http://hbase.apache.org/notsoquick.html#hadoop
>
> which does NOT have that citation.

Owwww.  Sorry Mike.  That is a stale reference.  Thats a prob. on our
site.  Let me fix... (It has actually been changed in the code for a
long time, just not deployed...)


> So I never saw that before now.  It is
> indeed helpful.  But: must we really spend hours on flaky tests while
> building?

You mean, having to run tests building hadoop?  (I've heard this hours
long test aspect of hadoop build referred to as the 'joy of hadoop').
Yeah, the hadoop build is painful.   You could skip running tests?


> Also, it would comfort noobs like me if there were a bit of
> explanation relating to the hadoop build instructions, which confusingly
> seem to sometimes build native binaries and sometimes not.
>

Understood.  We punt to the hadoop project for this, shamefully.


> Note: the remark that
>
> "You have to replace it if you are running on an hadoop that is other
> than an exact match to the jar we ship with"
>
> suggests one could (if "exact match") go down the very path that we are
> trying to discourage (even when "exact match").  This is what I tried, and
> was then told is not reliable, then was told might work, but nobody was
> willing to tell me whether or how I could be sure it would work.

Yeah, a warranty is hard to find in these parts (This is the area the
commercial vendors hang out in).


> So my
> suggestion is to be unequivocal about it: when running distributed, always
> build your own Hadoop and put its -core JAR into your HBase installation
> (or use Cloudera, which has done this for you).  Also: explicitly explain
> how the file has to be named (there is a strict naming requirement so that
> the launching scripts work, right?).
>

Thanks for this.  Will do as you suggest (It has Andrew endorsement too).

Also, I second what Andrew says: I do not know of any place where
the name of the jar is inscribed, so how the jar is named should play
no role at all.

St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Andrew Purtell <ap...@apache.org>.
Pardon if I've missed something but I think this thread comes down to:

On Mon, 6/6/11, Mike Spreitzer <ms...@us.ibm.com> wrote:
> So my suggestion is to be unequivocal about it: when running
> distributed, always build your own Hadoop and put its -core
> JAR into your HBase installation (or use Cloudera, which has
> done this for you).

This is a good suggestion. We try. But it seems not all the avenues to this information are covered. Suggestions on where to improve are helpful. Our Web UIs put up a big fat warning. We cover this issue in the online book, but as this thread suggests, we might make that better by pulling in some of Michael Noll's material either directly with his permission or by reference. We don't "own" the Hadoop wiki so can't do anything there. Our own wiki (and website) needs a refresh. When that happens we can cover this issue, perhaps with a compatibility matrix (with links to build or distro instructions), somewhere up front.

> Also: explicitly explain how the file has to be named (there
> is a strict naming requirement so that the launching scripts
> work, right?).

In my experience what the jar is named is not important. Remove the old Hadoop jars from HBase lib/. Drop in a suitable Hadoop -append variant core jar. We have a "frankenbase" internal HBase version, and we simply do that and also replace the ZooKeeper jar (we have a variant of that which can do SASL authentication) and all is well.
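
In shell terms that drop-in is just something like this (jar names
illustrative):

    cd $HBASE_HOME/lib                           # wherever HBase is installed
    rm hadoop-core-0.20-append-r1056497.jar      # the bundled jar
    cp /path/to/hadoop-core-0.20.2-append.jar .  # your cluster's append jar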

   - Andy


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Mike Spreitzer <ms...@us.ibm.com>.
I have been looking at

http://hbase.apache.org/notsoquick.html#hadoop

which does NOT have that citation.  So I never saw that before now.  It is 
indeed helpful.  But: must we really spend hours on flaky tests while 
building?  Also, it would comfort noobs like me if there were a bit of 
explanation relating to the hadoop build instructions, which confusingly 
seem to sometimes build native binaries and sometimes not.

Note: the remark that

"You have to replace it if you are running on an hadoop that is other
than an exact match to the jar we ship with"

suggests one could (if "exact match") go down the very path that we are 
trying to discourage (even when "exact match").  This is what I tried, and 
was then told is not reliable, then was told might work, but nobody was 
willing to tell me whether or how I could be sure it would work.  So my 
suggestion is to be unequivocal about it: when running distributed, always 
build your own Hadoop and put its -core JAR into your HBase installation 
(or use Cloudera, which has done this for you).  Also: explicitly explain 
how the file has to be named (there is a strict naming requirement so that 
the launching scripts work, right?).

Regards,
Mike Spreitzer




From:   stack <sa...@gmail.com>
To:     Mike Spreitzer/Watson/IBM@IBMUS
Cc:     user@hbase.apache.org
Date:   06/06/2011 10:58 PM
Subject:        Re: Hadoop not working after replacing hadoop-core.jar 
with hadoop-core-append.jar



On Mon, Jun 6, 2011 at 6:49 PM, Mike Spreitzer <ms...@us.ibm.com> 
wrote:
> Where is that citation of Michael Noll's nicely detailed instruction on how
> to build the append branch?
>

See Section 1.3.1.2 here
http://hbase.apache.org/book/notsoquick.html#requirements.  Look for
"Michael Noll has written a detailed blog, Building an Hadoop 0.20.x
version for HBase 0.90.2, on how to build an Hadoop from
branch-0.20-append. Recommended."

> Why does hbase include a hadoop-core.jar?  The instructions say I should
> replace it, so why am I given it in the first place?
>

You have to replace it if you are running on an hadoop that is other
than an exact match to the jar we ship with (If you are doing
standalone mode or if you are running unit tests, the jar is needed
since we have a bunch of Hadoop dependencies from our Configuration to
UI to MapReduce to Connection to HDFS etc.)

Again, I apologize for the fact that this is less-than-smooth sailing.
The HBase project is in a bit of an awkward spot.  We're trying to put
the best face on it.  If you have any suggestions for how best we
might do this, we are all ears.

Yours,
St.Ack


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by "Michael G. Noll" <mi...@googlemail.com>.
Hi all,

I have updated my Hadoop+HBase article with the filenaming-related
information of the Hadoop core jar file (thanks for your notification,
St.Ack).


FYI, Hadoop 0.20.2 and 0.20.203.0 differ in how they find their hadoop-* JAR
files.

Hadoop 0.20.2 (bin/hadoop):

    for f in $HADOOP_HOME/hadoop-*-core.jar; do
      CLASSPATH=${CLASSPATH}:$f;
    done

Hadoop 0.20.203.0 (bin/hadoop):

    for f in $HADOOP_HOME/hadoop-core-*.jar; do
      CLASSPATH=${CLASSPATH}:$f;
    done
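
To illustrate with the jar names from this thread (a quick check you can
run from $HADOOP_HOME):

    ls hadoop-*-core.jar   # 0.20.2 glob: matches hadoop-0.20.2-core.jar but
                           # NOT hadoop-core-0.20-append-r1056497.jar
    ls hadoop-core-*.jar   # 0.20.203.0 glob: matches the append jar as-is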

Hence I added a note to the article that you only need to rename the
custom-built JAR files if you are running the older _0.20.2_ release.


Follow-up question:
Since both Hadoop 0.20.203.0 and HBase 0.90.3 were recently released (the
article is about 0.20.2 and 0.90.2), is there anything else I can update in
the article? For instance the section with the matrix showing version
compatibilities between Hadoop and HBase? Here at work we are still testing
our slightly customized Hadoop 0.20.203.0 version (we have integrated some
patches which are not in the stock release) and I haven't had the chance to
look at HBase 0.90.3 at all yet. Just shoot me a message and I'll integrate
the missing information in the article.


Andrew Purtell wrote:
> We cover this issue in the online book, but as this thread suggests,
> we might make that better by pulling in some of Michael Noll's material
> either directly with his permission or by reference. We don't "own" the
> Hadoop wiki so can't do anything there. Our own wiki (and website) needs
> a refresh. When that happens we can cover this issue, perhaps with a
> compatibility matrix (with links to build or distro instructions),
somewhere up front.

Oh, almost forgot to reply to that: Of course feel free to integrate my
article back into the HBase docs. I had already offered St.Ack to help out
with the writing but back in April the decision was made to just link to my
blog (i.e. a reference rather than a book/website/wiki update). Personally,
I'd prefer this information to be available in the official docs rather than
on an individual's website. Just let me know what you guys prefer.

Best,
Michael


PS: It's getting increasingly hard to keep track of all the Michaels on
this list. ;-)



On Fri, Jun 10, 2011 at 21:19, stack <sa...@gmail.com> wrote:

> Thank you Mike.  I made your suggested change and pushed to the site
> (takes an hour or two to be visible).  I also wrote Michael Noll to
> make stronger note on jar naming, that append jar does not conform
> (Michael is usually responsive so should get fixed soon).
>
> Thanks for the input.
>
> St.Ack
>
> On Fri, Jun 10, 2011 at 11:43 AM, Mike Spreitzer <ms...@us.ibm.com>
> wrote:
> > Thanks for the clarification.  How about the following update to the
> text?
> >  Where it currently says
> >
> > <para>Because HBase depends on Hadoop, it bundles an instance of the 240
> > Hadoop jar under its <filename>lib</filename> directory. The bundled 241
> > Hadoop was made ...
> >
> > we clarify that the bundled JAR is ONLY for use in standalone mode,
> perhaps
> > like this:
> >
> > <para>Because HBase depends on Hadoop, it bundles an instance of the 240
> > Hadoop jar under its <filename>lib</filename> directory --- but this is
> > <emphasis>only</emphasis> for use in standalone mode. The bundled 241
> Hadoop
> > was made ...
> >
> > Also, why does this text say 240 at first and then 241?  And why not
> > explicitly give the name of the JAR file in question, which is
> > hadoop-core-0.20-append-r1056497?
> >
> > Thanks,
> > Mike Spreitzer
> >
> >
> >
> >
> > From:        Stack <st...@duboce.net>
> > To:        user@hbase.apache.org
> > Date:        06/10/2011 02:33 PM
> > Subject:        Re: Hadoop not working after replacing hadoop-core.jar
> with
> > hadoop-core-append.jar
> > Sent by:        saint.ack@gmail.com
> > ________________________________
> >
> >
> > On Fri, Jun 10, 2011 at 10:06 AM, Mike Spreitzer <ms...@us.ibm.com>
> > wrote:
> >> stack <sa...@gmail.com> wrote on 06/06/2011 10:57:50 PM:
> >>> From: stack <sa...@gmail.com>
> >> Let me see if I have got this straight.  Hadoop branch-0.20-append is
> not
> >> an immutable thing, it has evolved a little over time.
> >
> > Yes.
> >
> >> The hadoop-core.jar that is included in the HBase distribution was built
> >> from
> >> some version of branch-0.20-append.  If my own Hadoop cluster is EXACTLY
> >> the same version of branch-0.20-append then I do not need to replace any
> >> files anywhere.  However, since nobody is telling me the version of
> >> branch-0.20-append from which HBase's hadoop-core.jar was built,
> >
> > It says so in the jar name.  The jar is called hadoop-core-0.20-append-r1056497.
> > The latter is the svn revision we built the jar from.
> >
> >
> >>  I can not
> >> in any case or way be confident that my cluster is running EXACTLY the
> >> same version even if it is branch-0.20-append.
> >
> > Not true.
> >
> >> So the net result is that
> >> in all distributed cases (except when I import pre-built Hadoop+HBase
> from
> >> Cloudera or elsewhere) I have to build branch-0.20-append and copy its
> >> core JAR into my HBase lib.  Have I got this right?  The book still does
> >> not say that clearly.  In fact, the book still points to my old email
> >> saying I did it the other way around.  Your reply above clearly seems to
> >> imply that I need to replace HBase's hadoop core JAR only in some
> >> distributed cases.  Yet the rest of the email conversation on this point
> >> seems to have settled that HBase's hadoop core JAR needs to be replaced
> in
> >> all distributed cases.
> >>
> >
> > My reading of the text is that the jar should always be replaced.  I
> > added the callout that cited your old mail thread because I thought it
> > might be of interest.  It seems to have only made confusion so in the
> > edit I posted above (for review), I'd completely removed it.
> >
> > The edit has not been pushed to the website.  I thought I'd get a bit
> > of input on it first.  Here is the link again:
> > http://svn.apache.org/viewvc?view=revision&revision=1134129
> >
> > St.Ack
> >
> >
>

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by stack <sa...@gmail.com>.
Thank you Mike.  I made your suggested change and pushed to the site
(takes an hour or two to be visible).  I also wrote Michael Noll to
make stronger note on jar naming, that append jar does not conform
(Michael is usually responsive so should get fixed soon).

Thanks for the input.

St.Ack

On Fri, Jun 10, 2011 at 11:43 AM, Mike Spreitzer <ms...@us.ibm.com> wrote:
> Thanks for the clarification.  How about the following update to the text?
>  Where it currently says
>
> <para>Because HBase depends on Hadoop, it bundles an instance of the 240
> Hadoop jar under its <filename>lib</filename> directory. The bundled 241
> Hadoop was made ...
>
> we clarify that the bundled JAR is ONLY for use in standalone mode, perhaps
> like this:
>
> <para>Because HBase depends on Hadoop, it bundles an instance of the 240
> Hadoop jar under its <filename>lib</filename> directory --- but this is
> <emphasis>only</emphasis> for use in standalone mode. The bundled 241 Hadoop
> was made ...
>
> Also, why does this text say 240 at first and then 241?  And why not
> explicitly give the name of the JAR file in question, which is
> hadoop-core-0.20-append-r1056497?
>
> Thanks,
> Mike Spreitzer
>
>
>
>
> From:        Stack <st...@duboce.net>
> To:        user@hbase.apache.org
> Date:        06/10/2011 02:33 PM
> Subject:        Re: Hadoop not working after replacing hadoop-core.jar with
> hadoop-core-append.jar
> Sent by:        saint.ack@gmail.com
> ________________________________
>
>
> On Fri, Jun 10, 2011 at 10:06 AM, Mike Spreitzer <ms...@us.ibm.com>
> wrote:
>> stack <sa...@gmail.com> wrote on 06/06/2011 10:57:50 PM:
>>> From: stack <sa...@gmail.com>
>> Let me see if I have got this straight.  Hadoop branch-0.20-append is not
>> an immutable thing, it has evolved a little over time.
>
> Yes.
>
>> The hadoop-core.jar that is included in the HBase distribution was built
>> from
>> some version of branch-0.20-append.  If my own Hadoop cluster is EXACTLY
>> the same version of branch-0.20-append then I do not need to replace any
>> files anywhere.  However, since nobody is telling me the version of
>> branch-0.20-append from which HBase's hadoop-core.jar was built,
>
> It says so in the jar name.  The jar is called hadoop-core-0.20-append-r1056497.
> The latter is the svn revision we built the jar from.
>
>
>>  I can not
>> in any case or way be confident that my cluster is running EXACTLY the
>> same version even if it is branch-0.20-append.
>
> Not true.
>
>> So the net result is that
>> in all distributed cases (except when I import pre-built Hadoop+HBase from
>> Cloudera or elsewhere) I have to build branch-0.20-append and copy its
>> core JAR into my HBase lib.  Have I got this right?  The book still does
>> not say that clearly.  In fact, the book still points to my old email
>> saying I did it the other way around.  Your reply above clearly seems to
>> imply that I need to replace HBase's hadoop core JAR only in some
>> distributed cases.  Yet the rest of the email conversation on this point
>> seems to have settled that HBase's hadoop core JAR needs to be replaced in
>> all distributed cases.
>>
>
> My reading of the text is that the jar should always be replaced.  I
> added the callout that cited your old mail thread because I thought it
> might be of interest.  It seems to have only made confusion so in the
> edit I posted above (for review), I'd completely removed it.
>
> The edit has not been pushed to the website.  I thought I'd get a bit
> of input on it first.  Here is the link again:
> http://svn.apache.org/viewvc?view=revision&revision=1134129
>
> St.Ack
>
>

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Mike Spreitzer <ms...@us.ibm.com>.
Thanks for the clarification.  How about the following update to the text? 
 Where it currently says

<para>Because HBase depends on Hadoop, it bundles an instance of the 240 
Hadoop jar under its <filename>lib</filename> directory. The bundled 241 
Hadoop was made ...

we clarify that the bundled JAR is ONLY for use in standalone mode, 
perhaps like this:

<para>Because HBase depends on Hadoop, it bundles an instance of the 240 
Hadoop jar under its <filename>lib</filename> directory --- but this is 
<emphasis>only</emphasis> for use in standalone mode. The bundled 241 
Hadoop was made ...

Also, why does this text say 240 at first and then 241?  And why not 
explicitly give the name of the JAR file in question, which is 
hadoop-core-0.20-append-r1056497?

Thanks,
Mike Spreitzer




From:   Stack <st...@duboce.net>
To:     user@hbase.apache.org
Date:   06/10/2011 02:33 PM
Subject:        Re: Hadoop not working after replacing hadoop-core.jar 
with hadoop-core-append.jar
Sent by:        saint.ack@gmail.com



On Fri, Jun 10, 2011 at 10:06 AM, Mike Spreitzer <ms...@us.ibm.com> 
wrote:
> stack <sa...@gmail.com> wrote on 06/06/2011 10:57:50 PM:
>> From: stack <sa...@gmail.com>
> Let me see if I have got this straight.  Hadoop branch-0.20-append is 
not
> an immutable thing, it has evolved a little over time.

Yes.

> The hadoop-core.jar that is included in the HBase distribution was built 
from
> some version of branch-0.20-append.  If my own Hadoop cluster is EXACTLY
> the same version of branch-0.20-append then I do not need to replace any
> files anywhere.  However, since nobody is telling me the version of
> branch-0.20-append from which HBase's hadoop-core.jar was built,

It says so in the jar name.  The jar is called hadoop-core-0.20-append-r1056497.
The latter is the svn revision we built the jar from.


>  I can not
> in any case or way be confident that my cluster is running EXACTLY the
> same version even if it is branch-0.20-append.

Not true.

> So the net result is that
> in all distributed cases (except when I import pre-built Hadoop+HBase 
from
> Cloudera or elsewhere) I have to build branch-0.20-append and copy its
> core JAR into my HBase lib.  Have I got this right?  The book still does
> not say that clearly.  In fact, the book still points to my old email
> saying I did it the other way around.  Your reply above clearly seems to
> imply that I need to replace HBase's hadoop core JAR only in some
> distributed cases.  Yet the rest of the email conversation on this point
> seems to have settled that HBase's hadoop core JAR needs to be replaced 
in
> all distributed cases.
>

My reading of the text is that the jar should always be replaced.  I
added the callout that cited your old mail thread because I thought it
might be of interest.  It seems to have only made confusion so in the
edit I posted above (for review), I'd completely removed it.

The edit has not been pushed to the website.  I thought I'd get a bit
of input on it first.  Here is the link again:
http://svn.apache.org/viewvc?view=revision&revision=1134129

St.Ack


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Fri, Jun 10, 2011 at 10:06 AM, Mike Spreitzer <ms...@us.ibm.com> wrote:
> stack <sa...@gmail.com> wrote on 06/06/2011 10:57:50 PM:
>> From: stack <sa...@gmail.com>
> Let me see if I have got this straight.  Hadoop branch-0.20-append is not
> an immutable thing, it has evolved a little over time.

Yes.

> The hadoop-core.jar that is included in the HBase distribution was built from
> some version of branch-0.20-append.  If my own Hadoop cluster is EXACTLY
> the same version of branch-0.20-append then I do not need to replace any
> files anywhere.  However, since nobody is telling me the version of
> branch-0.20-append from which HBase's hadoop-core.jar was built,

It says so in the jar name.  The jar is called hadoop-core-0.20-append-r1056497.
The latter is the svn revision we built the jar from.


>  I can not
> in any case or way be confident that my cluster is running EXACTLY the
> same version even if it is branch-0.20-append.

Not true.

> So the net result is that
> in all distributed cases (except when I import pre-built Hadoop+HBase from
> Cloudera or elsewhere) I have to build branch-0.20-append and copy its
> core JAR into my HBase lib.  Have I got this right?  The book still does
> not say that clearly.  In fact, the book still points to my old email
> saying I did it the other way around.  Your reply above clearly seems to
> imply that I need to replace HBase's hadoop core JAR only in some
> distributed cases.  Yet the rest of the email conversation on this point
> seems to have settled that HBase's hadoop core JAR needs to be replaced in
> all distributed cases.
>

My reading of the text is that the jar should always be replaced.  I
added the callout that cited your old mail thread because I thought it
might be of interest.  It seems to have only made confusion so in the
edit I posted above (for review), I'd completely removed it.

The edit has not been pushed to the website.  I thought I'd get a bit
of input on it first.  Here is the link again:
http://svn.apache.org/viewvc?view=revision&revision=1134129

St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Andrew Purtell <ap...@apache.org>.
We (or at least I) were talking about the name of the Hadoop core jar in HBase lib/ not being of any particular importance.

Best regards,

    - Andy

Problems worthy of attack prove their worth by hitting back. - Piet Hein (via Tom White)


--- On Fri, 6/10/11, Mike Spreitzer <ms...@us.ibm.com> wrote:

> From: Mike Spreitzer <ms...@us.ibm.com>
> Subject: Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar
> To: user@hbase.apache.org
> Date: Friday, June 10, 2011, 10:10 AM
> Also, regarding names: yes, the JAR
> file names matter.  In Hadoop's 
> bin/hadoop I see this:
> 
> for f in $HADOOP_HOME/hadoop-core-*.jar; do
>   CLASSPATH=${CLASSPATH}:$f;
> done
> 
> Regards,
> Mike Spreitzer
> 

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Mike Spreitzer <ms...@us.ibm.com>.
Also, regarding names: yes, the JAR file names matter.  In Hadoop's 
bin/hadoop I see this:

for f in $HADOOP_HOME/hadoop-core-*.jar; do
  CLASSPATH=${CLASSPATH}:$f;
done

Regards,
Mike Spreitzer

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Mike Spreitzer <ms...@us.ibm.com>.
stack <sa...@gmail.com> wrote on 06/06/2011 10:57:50 PM:

> From: stack <sa...@gmail.com>
> To: Mike Spreitzer/Watson/IBM@IBMUS
> Cc: user@hbase.apache.org
> Date: 06/06/2011 10:58 PM
> Subject: Re: Hadoop not working after replacing hadoop-core.jar with
> hadoop-core-append.jar
> 
> ...
> 
> > Why does hbase include a hadoop-core.jar?  The instructions say I should
> > replace it, so why am I given it in the first place?
> >
> 
> You have to replace it if you are running on an hadoop that is other
> than an exact match to the jar we ship with (If you are doing
> standalone mode or if you are running unit tests, the jar is needed
> since we have a bunch of Hadoop dependencies from our Configuration to
> UI to MapReduce to Connection to HDFS etc.)
> 
> ...
> 
> Yours,
> St.Ack

Let me see if I have got this straight.  Hadoop branch-0.20-append is not 
an immutable thing, it has evolved a little over time.  The 
hadoop-core.jar that is included in the HBase distribution was built from 
some version of branch-0.20-append.  If my own Hadoop cluster is EXACTLY 
the same version of branch-0.20-append then I do not need to replace any 
files anywhere.  However, since nobody is telling me the version of 
branch-0.20-append from which HBase's hadoop-core.jar was built, I can not 
in any case or way be confident that my cluster is running EXACTLY the 
same version even if it is branch-0.20-append.  So the net result is that 
in all distributed cases (except when I import pre-built Hadoop+HBase from 
Cloudera or elsewhere) I have to build branch-0.20-append and copy its
core JAR into my HBase lib.  Have I got this right?  The book still does 
not say that clearly.  In fact, the book still points to my old email 
saying I did it the other way around.  Your reply above clearly seems to 
imply that I need to replace HBase's hadoop core JAR only in some 
distributed cases.  Yet the rest of the email conversation on this point 
seems to have settled that HBase's hadoop core JAR needs to be replaced in 
all distributed cases.

Thanks,
Mike

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by stack <sa...@gmail.com>.
On Mon, Jun 6, 2011 at 6:49 PM, Mike Spreitzer <ms...@us.ibm.com> wrote:
> Where is that citation of Michael Noll's nicely detailed instruction on how
> to build the append branch?
>

See Section 1.3.1.2 here
http://hbase.apache.org/book/notsoquick.html#requirements.  Look for
"Michael Noll has written a detailed blog, Building an Hadoop 0.20.x
version for HBase 0.90.2, on how to build an Hadoop from
branch-0.20-append. Recommended."

> Why does hbase include a hadoop-core.jar?  The instructions say I should
> replace it, so why am I given it in the first place?
>

You have to replace it if you are running on an hadoop that is other
than an exact match to the jar we ship with (If you are doing
standalone mode or if you are running unit tests, the jar is needed
since we have a bunch of Hadoop dependencies from our Configuration to
UI to MapReduce to Connection to HDFS etc.)

Again, I apologize for the fact that this is less-than-smooth sailing.
The HBase project is in a bit of an awkward spot.  We're trying to put
the best face on it.  If you have any suggestions for how best we
might do this, we are all ears.

Yours,
St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Mike Spreitzer <ms...@us.ibm.com>.
Where is that citation of Michael Noll's nicely detailed instruction on 
how to build the append branch?

Why does hbase include a hadoop-core.jar?  The instructions say I should 
replace it, so why am I given it in the first place?

Thanks,
Mike Spreitzer



From:   Stack <st...@duboce.net>
To:     user@hbase.apache.org
Date:   06/06/2011 03:40 PM
Subject:        Re: Hadoop not working after replacing hadoop-core.jar 
with hadoop-core-append.jar
Sent by:        saint.ack@gmail.com



On Mon, Jun 6, 2011 at 11:24 AM, Joe Pallas <jo...@oracle.com> 
wrote:
> Hi St.Ack.  Here is the sense in which the book leads a new user to the 
route that Mike (and I) took.  It seems to say this:
>
> <paraphrase>
> You have a choice.  You can download the source for the append branch of 
> hadoop and build it yourself from scratch, which will take who knows how
> long and require additional tools and may not work on your preferred
> development platform (see <http://search-hadoop.com/m/8Efvi1EEiaf>, which
> says "Building sucks"), or you can take this shortcut that seems to work,
> but has no guarantees.  What you cannot do is find a pre-built release of
> the append branch anywhere for you to just download and use.  Your call.
> </paraphrase>
>
> Now, maybe that isn't the message you actually intend.

Its not.  In particular, the "...which will take who knows how long
and require additional tools and may not work on your preferred
development platform" bit.  Michael Noll has written up a nicely
detailed instruction on how to build the append branch.  Its cited up
front in our doc.  Is it not helpful?  I'd think that readers would
give this posting more credence than a "Building sucks" comment made
by our Ryan, HBase's (proud!) Statler and Waldorf combined [1].

The 'shortcut' will work its just that folks normally go the opposite
direction to that of your Michael; they copy their cluster's hadoop
jars into hbase rather than hbase's hadoop jar to the cluster.  I'm
guessing that Michael went this route because he would avoid CDH?
(Is that right?)



> Would it be some sort of horrible Apache faux pas for the HBase project 
> to distribute its own release of the version of Hadoop that is required by
> HBase?

This came up recently over in hadoop.  HBasers were pitching to host a
build of the append branch over in hbase-land.  It was found that if
we did such a thing, we'd have to call it something other than Hadoop;
only  Hadoop can host Hadoop releases.  We didn't want to add yet more
confusion to an already torturous landscape so we passed on it.

> Because the Hadoop project isn't likely to do it, apparently, and, if I 
> understand correctly, HBase is not going to work anytime soon with the
> next Hadoop release that has append support.  So this is not a problem
> that is going to fix itself.
>

HBase will work with the next Hadoop release, 0.22.0, when it comes
out [2].  The current state of things is temporary (I believe).  Sorry
for the inconvenience.

Thanks for the above input.  Our hadoop section needs updating now
there are yet more versions afoot.  The above will help when we recast
this section.

St.Ack

1.  http://muppet.wikia.com/wiki/Statler_and_Waldorf
2. https://issues.apache.org/jira/browse/HBASE-2233


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by praveenesh kumar <pr...@gmail.com>.
Guys, just adding to the conversation about this hadoop-append version
thing: I think it should also be mentioned on the Hadoop website:
http://hadoop.apache.org/common/releases.html
There is no mention there of a hadoop-append release, or of any release
whose HDFS has a durable sync.
Newbies like me, who are first trying to learn Hadoop, either go for the
stable version (e.g. Hadoop 0.20.2) or the latest version, thinking that
it will have covered most of the issues.

And once we get comfortable with it, put it on our cluster, and try to
move on to HBase, we realize that we need another version to work with
HBase, so we try workarounds like this one, which results in emails like
this.

Thanks,
Praveenesh

On Tue, Jun 7, 2011 at 5:06 AM, Joe Pallas <jo...@oracle.com> wrote:

>
> On Jun 6, 2011, at 12:36 PM, Stack wrote:
>
> > In particular, the "...which will take who knows how long
> > and require additional tools and may not work on your preferred
> > development platform" bit.  Michael Noll has written up a nicely
> > detailed instruction on how to build the append branch.  Its cited up
> > front in our doc.  Is it not helpful?  I'd think that readers would
> > give this posting more credence than a "Building sucks" comment made
> > by our Ryan, HBase's (proud!) Statler and Waldorf combined [1].
>
> I agree that Michael's instructions look a lot friendlier.  It's been a
> while since I looked at this part of the book and it is much less daunting
> now.
>
> > HBase will work with the next Hadoop release, 0.22.0, when it comes
> > out [2].  The current state of things is temporary (I believe).  Sorry
> > for the inconvenience.
>
> Good to know.  I had somehow gotten the idea that there was a compatibility
> issue with 0.22 that might not get resolved, but I must have been confused.
>  Or maybe there was a question about whether Hadoop 0.22 would stabilize
> before the HBase 0.92 branch?  In any case, I apologize for making things
> sound worse than they are.
>
> joe
>
>

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Mon, Jun 6, 2011 at 4:36 PM, Joe Pallas <jo...@oracle.com> wrote:
> Good to know.  I had somehow gotten the idea that there was a compatibility issue with 0.22 that might not get resolved, but I must have been confused.  Or maybe there was a question about whether Hadoop 0.22 would stabilize before the HBase 0.92 branch?  In any case, I apologize for making things sound worse than they are.
>

No worries.  Keep the feedback coming.  The hadoop/hbase story is
snakey and will be for another little while.
St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Joe Pallas <jo...@oracle.com>.
On Jun 6, 2011, at 12:36 PM, Stack wrote:

> In particular, the "...which will take who knows how long
> and require additional tools and may not work on your preferred
> development platform" bit.  Michael Noll has written up a nicely
> detailed instruction on how to build the append branch.  Its cited up
> front in our doc.  Is it not helpful?  I'd think that readers would
> give this posting more credence than a "Building sucks" comment made
> by our Ryan, HBase's (proud!) Statler and Waldorf combined [1].

I agree that Michael's instructions look a lot friendlier.  It's been a while since I looked at this part of the book and it is much less daunting now.

> HBase will work with the next Hadoop release, 0.22.0, when it comes
> out [2].  The current state of things is temporary (I believe).  Sorry
> for the inconvenience.

Good to know.  I had somehow gotten the idea that there was a compatibility issue with 0.22 that might not get resolved, but I must have been confused.  Or maybe there was a question about whether Hadoop 0.22 would stabilize before the HBase 0.92 branch?  In any case, I apologize for making things sound worse than they are.

joe


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Mon, Jun 6, 2011 at 11:24 AM, Joe Pallas <jo...@oracle.com> wrote:
> Hi St.Ack.  Here is the sense in which the book leads a new user to the route that Mike (and I) took.  It seems to say this:
>
> <paraphrase>
> You have a choice.  You can download the source for the append branch of hadoop and build it yourself from scratch, which will take who knows how long and require additional tools and may not work on your preferred development platform (see <http://search-hadoop.com/m/8Efvi1EEiaf>, which says "Building sucks"), or you can take this shortcut that seems to work, but has no guarantees.  What you cannot do is find a pre-built release of the append branch anywhere for you to just download and use.  Your call.
> </paraphrase>
>
> Now, maybe that isn't the message you actually intend.

Its not.  In particular, the "...which will take who knows how long
and require additional tools and may not work on your preferred
development platform" bit.  Michael Noll has written up a nicely
detailed instruction on how to build the append branch.  Its cited up
front in our doc.  Is it not helpful?  I'd think that readers would
give this posting more credence than a "Building sucks" comment made
by our Ryan, HBase's (proud!) Statler and Waldorf combined [1].

The 'shortcut' will work its just that folks normally go the opposite
direction to that of your Michael; they copy their cluster's hadoop
jars into hbase rather than hbase's hadoop jar to the cluster.  I'm
guessing that Michael went this route because he would avoid CDH?
(Is that right?)



> Would it be some sort of horrible Apache faux pas for the HBase project to distribute its own release of the version of Hadoop that is required by HBase?

This came up recently over in hadoop.  HBasers were pitching to host a
build of the append branch over in hbase-land.  It was found that if
we did such a thing, we'd have to call it something other than Hadoop;
only  Hadoop can host Hadoop releases.  We didn't want to add yet more
confusion to an already torturous landscape so we passed on it.

> Because the Hadoop project isn't likely to do it, apparently, and, if I understand correctly, HBase is not going to work anytime soon with the next Hadoop release that has append support.  So this is not a problem that is going to fix itself.
>

HBase will work with the next Hadoop release, 0.22.0, when it comes
out [2].  The current state of things is temporary (I believe).  Sorry
for the inconvenience.

Thanks for the above input.  Our hadoop section needs updating now
there are yet more versions afoot.  The above will help when we recast
this section.

St.Ack

1.  http://muppet.wikia.com/wiki/Statler_and_Waldorf
2. https://issues.apache.org/jira/browse/HBASE-2233

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Joe Pallas <jo...@oracle.com>.
On Jun 6, 2011, at 8:45 AM, Stack wrote:

> On Mon, Jun 6, 2011 at 7:18 AM, Mike Spreitzer <ms...@us.ibm.com> wrote:
>> My latest information (not from me, from actual experts) says it is NOT
>> the right approach.  Look further into that discussion thread.  I do not
>> understand why (http://hbase.apache.org/notsoquick.html#hadoop) still
>> points at that misleading message.
>> 
> 
> Let us know how we should improve it Mike.  That mail thread is
> mentioned as an addendum, in a local FAQ-like section, that you could
> go the route that you went at one time (as outlined by the thread) but
> then its not usual as is said in the thread.
> 
> The book, by my reading, does not lead you to the route you took but
> hey, I'm blind to a new-users' reading having long ago lost my noob
> eyes.  Thats why its helpful when new users volunteer improvements to
> the text.

Hi St.Ack.  Here is the sense in which the book leads a new user to the route that Mike (and I) took.  It seems to say this:

<paraphrase>
You have a choice.  You can download the source for the append branch of hadoop and build it yourself from scratch, which will take who knows how long and require additional tools and may not work on your preferred development platform (see <http://search-hadoop.com/m/8Efvi1EEiaf>, which says "Building sucks"), or you can take this shortcut that seems to work, but has no guarantees.  What you cannot do is find a pre-built release of the append branch anywhere for you to just download and use.  Your call.
</paraphrase>

Now, maybe that isn't the message you actually intend.  But, to be honest, I think that accurately describes reality, based on the mailing list discussions I've seen.  As a user, what I see coming from the HBase project is, "Look!  Here's a release!" but it doesn't include all the bits you need and you can't simply go somewhere and fetch those needed bits.  Ryan wrote <http://search-hadoop.com/m/fM0PrEEiaf>:

> It's really confusing, but the basic fact is there is no ASF released
> version of hadoop that runs HBase properly. My best suggestion is to
> complain to general@, and file JIRAs if you can. It helps when users
> complain, since I think everyone has gone tone deaf from me
> complaining :-)

Would it be some sort of horrible Apache faux pas for the HBase project to distribute its own release of the version of Hadoop that is required by HBase?  Because the Hadoop project isn't likely to do it, apparently, and, if I understand correctly, HBase is not going to work anytime soon with the next Hadoop release that has append support.  So this is not a problem that is going to fix itself.

joe


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
On Mon, Jun 6, 2011 at 7:18 AM, Mike Spreitzer <ms...@us.ibm.com> wrote:
> My latest information (not from me, from actual experts) says it is NOT
> the right approach.  Look further into that discussion thread.  I do not
> understand why (http://hbase.apache.org/notsoquick.html#hadoop) still
> points at that misleading message.
>

Let us know how we should improve it Mike.  That mail thread is
mentioned as an addendum, in a local FAQ-like section, that you could
go the route that you went at one time (as outlined by the thread) but
then its not usual as is said in the thread.

The book, by my reading, does not lead you to the route you took but
hey, I'm blind to a new-users' reading having long ago lost my noob
eyes.  Thats why its helpful when new users volunteer improvements to
the text.

Thanks,
St.Ack

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Mike Spreitzer <ms...@us.ibm.com>.
My latest information (not from me, from actual experts) says it is NOT 
the right approach.  Look further into that discussion thread.  I do not 
understand why (http://hbase.apache.org/notsoquick.html#hadoop) still 
points at that misleading message.

Regards,
Mike Spreitzer




From:   praveenesh kumar <pr...@gmail.com>
To:     common-user@hadoop.apache.org, user@hbase.apache.org
Date:   06/06/2011 09:24 AM
Subject:        Re: Hadoop not working after replacing hadoop-core.jar 
with hadoop-core-append.jar



Hello guys..

Changing the name of the hadoop-apppend-core.jar file to
hadoop-0.20.2-core.jar did the trick..
Its working now..
But is this the right solution to this problem ??

Thanks,
Praveenesh

On Mon, Jun 6, 2011 at 2:18 PM, praveenesh kumar <pr...@gmail.com> wrote:

>
> Hi,
>
> I was not able to see my email in the mail archive, so I am sending it again.
> Guys, I need your feedback!
>
> Thanks,
> Praveenesh
> ---------- Forwarded message ----------
> From: praveenesh kumar <pr...@gmail.com>
> Date: Mon, Jun 6, 2011 at 12:09 PM
> Subject: Hadoop is not working after adding
> hadoop-core-0.20-append-r1056497.jar
> To: common-user@hadoop.apache.org, user@hbase.apache.org
>
>
> Hello guys!
>
> I am currently working on HBase 0.90.3 and Hadoop 0.20.2.
>
> Since this Hadoop version does not support a durable sync in HDFS,
> I copied the hadoop-core-append jar file from the hbase/lib folder
> into the hadoop folder and used it to replace hadoop-0.20.2-core.jar,
> which was suggested in the following link:
>
> http://www.apacheserver.net/Using-Hadoop-bundled-in-lib-directory-HBase-at1136240.htm
>
> I believe this is what the link describes. If I
> am doing something wrong, kindly tell me.
>
> But now, after adding that jar file, I am not able to run my Hadoop. I am
> getting the following exception messages on my screen:
>
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/util/PlatformName
> ub13: Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.util.PlatformName
> ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
> ub13:   at java.security.AccessController.doPrivileged(Native Method)
> ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
> Program will exit.
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/hdfs/server/datanode/DataNode
> ub13: starting secondarynamenode, logging to
> /usr/local/hadoop/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-ub13.out
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/util/PlatformName
> ub13: Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.util.PlatformName
> ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
> ub13:   at java.security.AccessController.doPrivileged(Native Method)
> ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
> Program will exit.
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/hdfs/server/namenode/SecondaryNameNode
> Have I done something wrong.. Please guide me...!!
>
> Thanks,
> Praveenesh
>
>
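
If the jar swap is done anyway, despite the warning above, the usual caveat 
from the HBase side is that the very same core jar must be in place 
everywhere: under HADOOP_HOME on every node and under hbase/lib, or clients 
and daemons can end up on mismatched RPC versions. A rough sanity check 
follows; the host names (ub13 and so on) and install paths are assumptions 
taken loosely from this thread, so adjust them to the actual cluster:

    # Sketch only: compare checksums of the Hadoop core jar across nodes.
    # Host list and install paths are assumptions -- adjust to yours.
    for host in ub13 ub14 ub15; do
      ssh "$host" md5sum /usr/local/hadoop/hadoop/hadoop-0.20.2-core.jar
    done

    # The jar HBase bundles should be byte-identical, whatever it is named.
    md5sum /usr/local/hbase/lib/hadoop-core-0.20-append-r1056497.jar

All Hadoop nodes should report the same checksum, and it should match the 
jar HBase runs with.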


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by praveenesh kumar <pr...@gmail.com>.
Hello guys..

Changing the name of the hadoop-append-core.jar file to
hadoop-0.20.2-core.jar did the trick..
It's working now..
But is this the right solution to this problem?
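
For reference, the swap amounts to something like the sketch below. It 
assumes HADOOP_HOME is /usr/local/hadoop/hadoop (matching the log paths 
quoted further down) and that HBase lives under /usr/local/hbase; the 
append jar name is taken from the subject line of the forwarded mail, so 
verify both against the actual files:

    # Rough sketch; HBASE_HOME and the jar names are assumptions -- check yours.
    HADOOP_HOME=/usr/local/hadoop/hadoop
    HBASE_HOME=/usr/local/hbase

    # Keep a backup of the stock 0.20.2 core jar.
    # (.orig will not match the hadoop-*-core.jar classpath glob, so it is
    # safe to leave in place.)
    mv "$HADOOP_HOME/hadoop-0.20.2-core.jar" \
       "$HADOOP_HOME/hadoop-0.20.2-core.jar.orig"

    # Install the append build under the name the Hadoop scripts expect.
    cp "$HBASE_HOME/lib/hadoop-core-0.20-append-r1056497.jar" \
       "$HADOOP_HOME/hadoop-0.20.2-core.jar"

The rename most likely matters because bin/hadoop in 0.20.x builds its 
classpath from a glob along the lines of $HADOOP_HOME/hadoop-*-core.jar, 
which hadoop-core-0.20-append-r1056497.jar does not match; with the 
original name the daemons start with no core classes on the classpath, 
which is consistent with the NoClassDefFoundError output quoted below.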

Thanks,
Praveenesh

On Mon, Jun 6, 2011 at 2:18 PM, praveenesh kumar <pr...@gmail.com> wrote:

>
> Hi,
>
> Not able to see my email in the mail archive..So sending it again...!!!
> Guys.. need your feedback..!!
>
> Thanks,
> Praveenesh
> ---------- Forwarded message ----------
> From: praveenesh kumar <pr...@gmail.com>
> Date: Mon, Jun 6, 2011 at 12:09 PM
> Subject: Hadoop is not working after adding
> hadoop-core-0.20-append-r1056497.jar
> To: common-user@hadoop.apache.org, user@hbase.apache.org
>
>
> Hello guys..!!!
>
> I am currently working on HBase 0.90.3 and Hadoop 0.20.2
>
> Since this hadoop version does not support sync on hdfs..
> so I copied the *hadoop-core-append* jar file from the *hbase/lib* folder
> into the *hadoop* folder, replacing *hadoop-0.20.2-core.jar*,
> as suggested in the following link
>
>
> http://www.apacheserver.net/Using-Hadoop-bundled-in-lib-directory-HBase-at1136240.htm
>
> I guess this is what has been mentioned in the link that I am doing. If I
> am doing something wrong, kindly tell me.
>
> But now after adding that jar file.. I am not able to run my hadoop.. I am
> getting the following exception messages on my screen
>
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/util/PlatformName
> ub13: Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.util.PlatformName
> ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
> ub13:   at java.security.AccessController.doPrivileged(Native Method)
> ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
> Program will exit.
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/hdfs/server/datanode/DataNode
> ub13: starting secondarynamenode, logging to
> /usr/local/hadoop/hadoop/bin/../logs/hadoop-hadoop-secondarynamenode-ub13.out
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/util/PlatformName
> ub13: Caused by: java.lang.ClassNotFoundException:
> org.apache.hadoop.util.PlatformName
> ub13:   at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
> ub13:   at java.security.AccessController.doPrivileged(Native Method)
> ub13:   at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
> ub13:   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
> ub13:   at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
> ub13: Could not find the main class: org.apache.hadoop.util.PlatformName.
> Program will exit.
> ub13: Exception in thread "main" java.lang.NoClassDefFoundError:
> org/apache/hadoop/hdfs/server/namenode/SecondaryNameNode
> Have I done something wrong.. Please guide me...!!
>
> Thanks,
> Praveenesh
>
>


Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by praveenesh kumar <pr...@gmail.com>.
Stack..

Sorry for the confusion.. I was not able to see my email in the archive..
thought my email hadn't reached the list properly.. so that's why I sent it
again.. anyway, will take care in the future..!!

Regards,
Praveenesh

On Mon, Jun 6, 2011 at 8:58 PM, Stack <st...@duboce.net> wrote:

> Praveenesh:
>
> Please stop mailing hadoop common-user AND hbase user lists.  Mail one
> or the other.
>
>
> On Mon, Jun 6, 2011 at 1:48 AM, praveenesh kumar <pr...@gmail.com>
> wrote:
> > Hi,
> >
> > Not able to see my email in the mail archive..So sending it again...!!!
> >
>
> What brings on the exclamation marks?  Are we not responding to you fast
> enough?
>
> St.Ack
>

Re: Hadoop not working after replacing hadoop-core.jar with hadoop-core-append.jar

Posted by Stack <st...@duboce.net>.
Praveenesh:

Please stop mailing hadoop common-user AND hbase user lists.  Mail one
or the other.


On Mon, Jun 6, 2011 at 1:48 AM, praveenesh kumar <pr...@gmail.com> wrote:
> Hi,
>
> Not able to see my email in the mail archive..So sending it again...!!!
>

What brings on the exclamation marks?  Are we not responding to you fast enough?

St.Ack