Posted to user@hbase.apache.org by stchu <st...@gmail.com> on 2009/07/22 04:10:59 UTC

org.apache.hadoop.hbase.client.RetriesExhaustedException

Hi,

Recently I have been trying to import HDFS text files into HBase. The map
function reads each line (record) of the files and calculates the index of
that record. The map output is set as: <Key, Value> = <index+"\t"+columnName, record>.
TableReduce is used for the reduce function, which combines the records in
the value iterator into a single comma-separated String and then collects it
to the output. The map phase completes without any exception or warning, but
during the reduce phase several tasks fail with the following exception:

===============================================================================================================================================================
org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to
contact region server Some server for region
TestIndexP,,1248173466044, row '-0.0001_38.8370', but failed after 10
attempts.
Exceptions:

	at org.apache.hadoop.hbase.client.HConnectionManager$TableServers.processBatchOfRows(HConnectionManager.java:961)
	at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1397)
	at org.apache.hadoop.hbase.client.HTable.commit(HTable.java:1341)
	at org.apache.hadoop.hbase.client.HTable.commit(HTable.java:1321)
	at icl.atc.ites.hbase.PIndexCreator$TableReducer.reduce(PIndexCreator.java:320)
	at icl.atc.ites.hbase.PIndexCreator$TableReducer.reduce(PIndexCreator.java:258)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:436)
	at org.apache.hadoop.mapred.Child.main(Child.java:158)

================================================================================================================================================================
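For reference, the combining step described above (joining all the values
for one key with commas) can be sketched in plain, self-contained Java,
outside the MapReduce and HBase APIs. The class and method names here are
only for illustration; they are not the actual PIndexCreator code:

```java
import java.util.Arrays;
import java.util.Iterator;
import java.util.StringJoiner;

public class CombineValues {

    // Joins the reduce-side values for one key into a single
    // comma-separated String, as the reducer in the post does
    // before committing the row to HBase.
    public static String combine(Iterator<String> values) {
        StringJoiner joiner = new StringJoiner(",");
        while (values.hasNext()) {
            joiner.add(values.next());
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        Iterator<String> it = Arrays.asList("recA", "recB", "recC").iterator();
        // Prints the combined record string for one key.
        System.out.println(combine(it));  // recA,recB,recC
    }
}
```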

The job eventually failed. We used a 4-machine cluster (1 master + 3 slaves),
and the source data is more than 10 GB with about 3.3 billion rows. The
reduce input is about 3 times the size of the map input. I used Hadoop 0.19.1
and HBase 0.19.3, and I tried both 12 and 53 as the numReduceTasks, but both
runs failed. Could anyone help me? Thanks a lot.

stchu

Re: org.apache.hadoop.hbase.client.RetriesExhaustedException

Posted by Erik Holstad <er...@gmail.com>.
You are welcome!
Good luck and let us know if you have some more issues.

Regards Erik

Re: org.apache.hadoop.hbase.client.RetriesExhaustedException

Posted by stchu <st...@gmail.com>.
Hi, Erik,

Thanks for your suggestion. I tried importing some subsets of my data:
350 MB, 995 MB, 2.4 GB, and 4 GB. The first two completed without any
exception, but the latter two failed with the same exception. I will try
running these jobs on a larger cluster. Thanks a lot!

stchu


2009/7/23 Erik Holstad <er...@gmail.com>

> Hi Stchu!
> To me it looks like you are overloading the system and that your server
> goes down or becomes unreachable.
> Is this just for testing? In that case maybe you can make the test
> smaller, or better yet, make your cluster bigger, if you have that
> option.
>
> Regards Erik
>

Re: org.apache.hadoop.hbase.client.RetriesExhaustedException

Posted by Erik Holstad <er...@gmail.com>.
Hi Stchu!
To me it looks like you are overloading the system and that your server goes
down or becomes unreachable.
Is this just for testing? In that case maybe you can make the test smaller,
or better yet, make your cluster bigger, if you have that option.

Regards Erik