Posted to user@hbase.apache.org by "Taylor, Ronald C" <ro...@pnl.gov> on 2009/04/03 22:04:33 UTC

RE: Novice Hbase user - Hbase restart problem solved

 
Hi St. Ack, Erik,

Thanks very much for the help. I now have Hbase back up and running. I
actually completely deleted the HDFS directory, and reformatted from
scratch. I also deleted everything pertaining to Hadoop and Hbase in the
/tmp directory before doing a new invocation, as Erik suggested. 

However, my best guess as to what happened is that I started Hbase from
a very old shell, one opened before I added the environment variable
HBASE_HOME to the .mycshrc file. I checked, and the newer version of the
.mycshrc file never got sourced in that shell. That was probably the big
"whoops". Anyway, thanks again for the tips - they will certainly be
useful for future reference.

Now I'll get back to work and see if I can get around my bulk import
problem, using Ryan's doCommit() method.
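
In case it is useful to anyone following the thread, here is a minimal
sketch of the kind of buffered commit loop I have in mind. I have not
seen Ryan's doCommit() itself, so the table name, column name, and
batch size below are just placeholders, and I am assuming the 0.19
client API (BatchUpdate plus HTable.commit(List)):

import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.io.BatchUpdate;
import org.apache.hadoop.hbase.util.Bytes;

public class BulkLoadSketch {
    public static void main(String[] args) throws Exception {
        // "mytable" and the "data:" column family are placeholder names.
        HTable table = new HTable(new HBaseConfiguration(), "mytable");
        List<BatchUpdate> buffer = new ArrayList<BatchUpdate>();
        for (int i = 0; i < 100000; i++) {
            BatchUpdate bu = new BatchUpdate("row" + i);
            bu.put("data:value", Bytes.toBytes("value" + i));
            buffer.add(bu);
            // Commit in batches rather than one round trip per row.
            if (buffer.size() >= 1000) {
                table.commit(buffer);
                buffer.clear();
            }
        }
        if (!buffer.isEmpty()) {
            table.commit(buffer);  // flush whatever is left over
        }
    }
}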

 Ron

___________________________________________
Ronald Taylor, Ph.D.
Computational Biology & Bioinformatics Group
Pacific Northwest National Laboratory
902 Battelle Boulevard
P.O. Box 999, MSIN K7-90
Richland, WA  99352 USA
Office:  509-372-6568
Email: ronald.taylor@pnl.gov
www.pnl.gov

-----Original Message-----
From: saint.ack@gmail.com [mailto:saint.ack@gmail.com] On Behalf Of
stack
Sent: Thursday, April 02, 2009 12:19 AM
To: hbase-user@hadoop.apache.org
Subject: Re: Novice Hbase user needs more help

It doesn't look like your format actually reformatted, because it seems
to have left the hbase directory behind in hdfs, though probably minus
its content (the rootdir is there, but not the hbase.version file that
hbase writes on bootstrap). Try removing it, e.g. ./bin/hadoop fs -rmr
$HBASE_HOMEDIR, then try restarting hbase.
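
If it helps you reason about the failure, below is a rough sketch of
the kind of bootstrap check the master does against the version file.
To be clear, this is not HBase's actual code: the /hbase path and the
printed messages are assumptions for illustration. It just shows why a
rootdir that survived the reformat, minus a readable hbase.version,
stops startup:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class VersionFileCheck {
    public static void main(String[] args) throws IOException {
        FileSystem fs = FileSystem.get(new Configuration());
        Path rootDir = new Path("/hbase");  // assumed value of hbase.rootdir
        Path versionFile = new Path(rootDir, "hbase.version");
        if (!fs.exists(rootDir)) {
            // Fresh bootstrap: hbase creates the rootdir and writes hbase.version.
            System.out.println("no rootdir; a fresh one will be bootstrapped");
        } else if (!fs.exists(versionFile)) {
            // Your case: the rootdir survived the reformat but its content did not.
            System.out.println("rootdir present but no version file; remove the rootdir");
        } else {
            // Normal startup: read the recorded file system layout version.
            FSDataInputStream in = fs.open(versionFile);
            String version = in.readUTF();  // same readUTF you see in the stack trace
            in.close();
            System.out.println("file system layout version: " + version);
        }
    }
}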

St.Ack


On Thu, Apr 2, 2009 at 3:45 AM, Taylor, Ronald C
<ro...@pnl.gov> wrote:

>
> Hello Erik,
>
> Thanks for the info. Unfortunately, at the moment I seem to have 
> regressed to the point where I can't even bring up Hbase. That is, I 
> decided to clean things out before doing more work, and so I tried to 
> do a restart: Hbase was already down due to the malfunctioning of my 
> Java program, but I made sure by issuing a stop-hbase.sh command. I 
> then issued a stop-dfs.sh command to Hadoop, which worked OK. I then 
> did a format using
>
>   bin/hadoop namenode -format
>
> and restarted Hadoop. That also appeared to work OK, according to the
> Hadoop log files.
>
> But when I try to restart Hbase (which, as I said, had aborted due to
> the previously mentioned problem in my Java upload program), I get the
> error msgs below in the Hbase log file. There is a file named
>
>  /hbase/hbase.version
>
> that the error msgs talk about. I cannot remember seeing or setting
> up any such file when I first installed Hbase, and I cannot find it in
> the Hbase subdirectories now. All I had to do when I installed Hbase
> was make very minor mods to two of the files in the .../conf subdir,
> so I am at a loss as to why this hbase.version file is now required
> and why a block for it cannot be found.
>
> And I cannot find anything in the docs on this "hbase.version" file,
> or anything else that might be helpful regarding the "No live nodes
> contain current block" error msgs. I would deeply appreciate any help
> at this point, just to get Hbase back up and running.
>
>  Ron
>
> %%%%%%%%%%%%%%%%%%
>
> Error msgs from the Hbase log file when I tried to do a startup:
>
> Wed Apr  1 18:10:55 PDT 2009 Starting master on sidney
> ulimit -n 1024
> 2009-04-01 18:10:55,856 INFO org.apache.hadoop.hbase.master.HMaster:
> vmName=Java HotSpot(TM) Server VM, vmVendor=Sun Microsystems Inc.,
> vmVersion=11.0-b16
>
> 2009-04-01 18:10:55,857 INFO org.apache.hadoop.hbase.master.HMaster:
> vmInputArguments=[-Xmx1000m, -XX:+HeapDumpOnOutOfMemoryError,
> -Dhbase.log.dir=/sid/Hbase/hbase-0.19.0/bin/../logs,
> -Dhbase.log.file=hbase-hadoop-master-sidney.log,
> -Dhbase.home.dir=/sid/Hbase/hbase-0.19.0/bin/..,
> -Dhbase.id.str=hadoop, -Dhbase.root.logger=INFO,DRFA,
> -Djava.library.path=/sid/Hbase/hbase-0.19.0/bin/../lib/native/Linux-i386-32]
>
> 2009-04-01 18:10:56,187 INFO org.apache.hadoop.hbase.master.HMaster:
> Root region dir: hdfs://localhost:5302/hbase/-ROOT-/70236052
> 2009-04-01 18:10:56,201 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_6227212323375236304_1002 from any node: java.io.IOException: No live nodes contain current block
> 2009-04-01 18:10:59,205 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_6227212323375236304_1002 from any node: java.io.IOException: No live nodes contain current block
> 2009-04-01 18:11:02,208 INFO org.apache.hadoop.hdfs.DFSClient: Could not obtain block blk_6227212323375236304_1002 from any node: java.io.IOException: No live nodes contain current block
>
> 2009-04-01 18:11:05,212 WARN org.apache.hadoop.hdfs.DFSClient: DFS Read: java.io.IOException: Could not obtain block: blk_6227212323375236304_1002 file=/hbase/hbase.version
>        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1708)
>        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1536)
>        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1663)
>        at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1593)
>        at java.io.DataInputStream.readUnsignedShort(Unknown Source)
>        at java.io.DataInputStream.readUTF(Unknown Source)
>        at org.apache.hadoop.hbase.util.FSUtils.getVersion(FSUtils.java:101)
>        at org.apache.hadoop.hbase.util.FSUtils.checkVersion(FSUtils.java:120)
>        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:211)
>        at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:155)
>        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:96)
>        at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:78)
>        at org.apache.hadoop.hbase.master.HMaster.doMain(HMaster.java:966)
>        at org.apache.hadoop.hbase.master.HMaster.main(HMaster.java:1010)
>
> ___________________________________________
> Ronald Taylor, Ph.D.
> Computational Biology & Bioinformatics Group
> Pacific Northwest National Laboratory
> 902 Battelle Boulevard
> P.O. Box 999, MSIN K7-90
> Richland, WA  99352 USA
> Office:  509-372-6568
> Email: ronald.taylor@pnl.gov
> www.pnl.gov
>
> -----Original Message-----
> From: Erik Holstad [mailto:erikholstad@gmail.com]
> Sent: Wednesday, April 01, 2009 3:47 PM
> To: hbase-user@hadoop.apache.org
> Subject: Re: Novice Hbase user needs help with data upload - gets a 
> RetriesExhaustedException, followed by NoServerForRegionException
>
> Hi Ron!
> You can try looking at items 5 and 6 in:
>
> http://wiki.apache.org/hadoop/Hbase/Troubleshooting#5
>
> http://hadoop.apache.org/hbase/docs/r0.19.0/api/overview-summary.html#overview_description
>
> Some similar problems can be found in:
>
> http://www.nabble.com/RetriesExhaustedException--for-TableReduce-td22569113.html
> http://www.nabble.com/RetriesExhaustedException!-td22408156.html
>
> Hope that this can be of help.
> Regards, Erik
>