Posted to user@hbase.apache.org by Vidosh Sahu <vi...@girnarsoft.com> on 2013/02/08 09:57:44 UTC

Getting RetriesExhaustedException while getting rows

Hi,

I have a pseudo-distributed setup of HBase/Hadoop, with 2 tables in HBase.

I have a scheduler which runs at some frequency. The scheduler kicks off three different mappers one by one (on successful completion of one, the next one starts).

When I run it on a small amount of data it works fine, but on a large amount of data it throws the following exception -

org.apache.hadoop.hbase.client.RetriesExhaustedException: Trying to contact region server null for region , row '*********************', but failed after 10 attempts.
Exceptions:
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed
java.io.IOException: org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation@81f25closed

 at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getRegionServerWithRetries(HConnectionManager.java:1017)
 at org.apache.hadoop.hbase.client.HTable.get(HTable.java:549)
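
The repeated "HConnectionImplementation@81f25closed" lines mean the client's underlying HConnection was already marked closed, so each of the 10 retries failed immediately rather than reaching a region server. The job code is not shown in this thread, so the following is only a minimal sketch of the get() call at the bottom of the trace, written against the 0.90-era client API; the class name, table name and row key are placeholders.

// Minimal sketch only, not the poster's job code. "my_table" and
// "some-row-key" are placeholders.
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.util.Bytes;

public class GetExample {
  public static void main(String[] args) throws IOException {
    // HBaseConfiguration.create() picks up hbase-site.xml from the classpath.
    // HTables created with the same client configuration share one HConnection;
    // that shared connection is the object the "closed" messages above refer to.
    Configuration conf = HBaseConfiguration.create();
    HTable table = new HTable(conf, "my_table");
    try {
      Get get = new Get(Bytes.toBytes("some-row-key"));
      Result result = table.get(get);   // the call that throws RetriesExhaustedException above
      System.out.println("row found: " + !result.isEmpty());
    } finally {
      table.close();                    // flushes any writes buffered by this HTable
    }
  }
}

Because that HConnection is shared, anything that closes it (for example a call to HConnectionManager.deleteAllConnections(true)) while a later job still holds an HTable on it will make gets fail exactly as above.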

Thanks,
Vidosh

Re: Getting RetriesExhaustedException while getting rows

Posted by Vidosh Sahu <vi...@girnarsoft.com>.
Thanks Ted. Upgrading to the version you suggested, along with some minor hbase-site.xml modifications, made it work.
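
The thread does not say which hbase-site.xml properties were changed, so the snippet below is only an illustration of the kind of client-side retry/timeout settings that are commonly tuned on the 0.94 line; the values are placeholders, and property names and defaults should be checked against the hbase-default.xml shipped with the release.

<configuration>
  <!-- Illustrative values only; not the settings from this thread. -->
  <property>
    <name>hbase.client.retries.number</name>
    <value>20</value>
    <!-- default is 10, which matches "failed after 10 attempts" above -->
  </property>
  <property>
    <name>hbase.client.pause</name>
    <value>2000</value>
    <!-- milliseconds the client waits between retries -->
  </property>
  <property>
    <name>hbase.rpc.timeout</name>
    <value>120000</value>
    <!-- per-RPC timeout in milliseconds on the 0.94 client -->
  </property>
</configuration>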

Thanks,
Vidosh


Re: Getting RetriesExhaustedException while getting rows

Posted by Ted Yu <yu...@gmail.com>.
0.90.5 is quite old. Can you upgrade to 0.94.4?

What did the log from 117.196.234.171 <http://117.196.234.171:60293/> look like?

Thanks


Re: Getting RetriesExhaustedException while getting rows

Posted by Vidosh Sahu <vi...@girnarsoft.com>.
Hi Ted,

Thanks for the response.

HBase version - 0.90.5
Here is the RS log -

##########################################
2013-02-08 23:15:23,457 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server Responder, call multi(org.apache.hadoop.hbase.client.MultiAction@80ecfd) from 117.196.234.171:60278: output error
2013-02-08 23:15:23,457 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server handler 27 on 60020 caught: java.nio.channels.ClosedChannelException
 at sun.nio.ch.SocketChannelImpl.ensureWriteOpen(SocketChannelImpl.java:249)
 at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:440)
 at org.apache.hadoop.hbase.ipc.HBaseServer.channelIO(HBaseServer.java:1389)
 at org.apache.hadoop.hbase.ipc.HBaseServer.channelWrite(HBaseServer.java:1341)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.processResponse(HBaseServer.java:727)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.doRespond(HBaseServer.java:792)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1083)

2013-02-08 23:17:20,153 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server Responder, call multi(org.apache.hadoop.hbase.client.MultiAction@13dead6) from 117.196.234.171:60292: output error
2013-02-08 23:17:20,153 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server handler 28 on 60020 caught: java.nio.channels.ClosedChannelException
 at sun.nio.ch.SocketChannelImpl.ensureWriteOpen(SocketChannelImpl.java:249)
 at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:440)
 at org.apache.hadoop.hbase.ipc.HBaseServer.channelIO(HBaseServer.java:1389)
 at org.apache.hadoop.hbase.ipc.HBaseServer.channelWrite(HBaseServer.java:1341)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.processResponse(HBaseServer.java:727)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.doRespond(HBaseServer.java:792)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1083)

2013-02-08 23:18:23,343 DEBUG org.apache.hadoop.hbase.io.hfile.LruBlockCache: LRU Stats: total=956.76 KB, free=193.95 MB, max=194.89 MB, blocks=10, accesses=837, hits=111, hitRatio=13.26%%, cachingAccesses=121, cachingHits=111, cachingHitsRatio=91.73%%, evictions=0, evicted=0, evictedPerRun=NaN
2013-02-08 23:19:21,074 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server Responder, call multi(org.apache.hadoop.hbase.client.MultiAction@965fce) from 117.196.234.171:60293: output error
2013-02-08 23:19:21,074 WARN org.apache.hadoop.ipc.HBaseServer: IPC Server handler 29 on 60020 caught: java.nio.channels.ClosedChannelException
 at sun.nio.ch.SocketChannelImpl.ensureWriteOpen(SocketChannelImpl.java:249)
 at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:440)
 at org.apache.hadoop.hbase.ipc.HBaseServer.channelIO(HBaseServer.java:1389)
 at org.apache.hadoop.hbase.ipc.HBaseServer.channelWrite(HBaseServer.java:1341)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.processResponse(HBaseServer.java:727)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Responder.doRespond(HBaseServer.java:792)
 at org.apache.hadoop.hbase.ipc.HBaseServer$Handler.run(HBaseServer.java:1083)

###########################################################
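
The "output error" / ClosedChannelException warnings above are the region server failing to write a reply to a multi(...) call because the client at 117.196.234.171 had already dropped its socket, which usually indicates a client that timed out or died while waiting. The job's write path is not shown in the thread; purely as a rough sketch (class and table names are placeholders), the size of each batched multi(...) RPC a 0.90/0.94-era client sends when writes are buffered is bounded by its write buffer, which can be kept small so individual calls return quickly:

// Rough sketch only, not the job's actual code. With auto-flush off, puts are
// buffered on the client and shipped as one batched RPC when the buffer fills;
// a smaller buffer means smaller, quicker multi(...) calls.
import java.io.IOException;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;

public class WriteBufferTuning {
  public static void main(String[] args) throws IOException {
    HTable table = new HTable(HBaseConfiguration.create(), "my_table"); // placeholder table name
    table.setAutoFlush(false);              // buffer puts client-side
    table.setWriteBufferSize(1024 * 1024);  // 1 MB per batch instead of the 2 MB default
    // ... Puts issued by the job would go here ...
    table.flushCommits();                   // send anything still buffered
    table.close();
  }
}

Whether that applies here depends on what the mappers actually do; it is only meant to show where the multi(...) calls in the log come from.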

Thanks,
Vidosh


Re: Getting RetriesExhaustedException while getting rows

Posted by Ted Yu <yu...@gmail.com>.
Can you tell us the version of HBase you're using?

If you can publish the region server log from around the time of such an error, that would help too.

Please use pastebin for the log snippet.

Thanks
