Posted to user@hbase.apache.org by "Raja.Aravapalli" <Ra...@target.com> on 2016/07/11 20:27:23 UTC

Caused by: java.io.InterruptedIOException

Hi,


One of our applications, which does put operations on an HBase table, is failing with the exception below in the log. Can someone please help me debug and fix this issue? I am new to HBase.


Caused by: java.io.InterruptedIOException: #2, interrupted. currentNumberOfTask=1
        at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1661)
        at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1687)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:208)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
        at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1449)
        at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1040)
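
For context, in the HBase 1.x client this trace corresponds to HTable.put() flushing buffered mutations: the flush waits for outstanding AsyncProcess write tasks, and if the calling thread is interrupted during that wait the client throws java.io.InterruptedIOException. Below is a minimal sketch of that write path against the standard 1.x API (the table, family, and qualifier names are made up for illustration; this is not the application's actual code):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class SinglePutSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("example_table"))) {
            Put put = new Put(Bytes.toBytes("row-1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
            // In HBase 1.x, HTable.put() buffers the mutation and then flushes it.
            // If the calling thread is interrupted while AsyncProcess waits for
            // outstanding write tasks, the client throws
            // java.io.InterruptedIOException, which is the condition in the trace.
            table.put(put);
        }
    }
}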




Regards,
Raja.

Re: Caused by: java.io.InterruptedIOException

Posted by "Raja.Aravapalli" <Ra...@target.com>.
Sure. Thanks, Dima.


Regards,
Raja.




On 7/11/16, 5:51 PM, "Dima Spivak" <ds...@cloudera.com> wrote:

>Hm, sorry to say that you might have better luck at getting to the bottom
>of this at Hortonworks’ user forums since people there are more likely to
>have played with Apex and the like. The stack trace you included is
>actually a pretty generic one so it’d be hard for someone to see those
>lines and chime in with a specific cause without way more details about
>your use case and the configuration of your cluster. If I had to guess,
>there’s probably some tuning that might be needed within your HBase/Apex
>configuration, but you’re venturing into things that people familiar with
>HDP are more likely to be able to help out with.
>
>-Dima
>

Re: Caused by: java.io.InterruptedIOException

Posted by Dima Spivak <ds...@cloudera.com>.
Hm, sorry to say that you might have better luck at getting to the bottom
of this at Hortonworks’ user forums since people there are more likely to
have played with Apex and the like. The stack trace you included is
actually a pretty generic one so it’d be hard for someone to see those
lines and chime in with a specific cause without way more details about
your use case and the configuration of your cluster. If I had to guess,
there’s probably some tuning that might be needed within your HBase/Apex
configuration, but you’re venturing into things that people familiar with
HDP are more likely to be able to help out with.
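
To make the tuning hint a little more concrete, these are a few client-side HBase settings that bound the AsyncProcess work the flush in that trace waits on. This is only a sketch with illustrative values, not a recommendation, and any Apex-side settings are a separate matter:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;

public class ClientTuningSketch {
    // Sketch only: client-side settings that limit how much write work the
    // flush in the trace has to wait for. Values are illustrative; the right
    // numbers depend on the cluster and workload.
    public static Configuration tunedClientConf() {
        Configuration conf = HBaseConfiguration.create();
        conf.setInt("hbase.client.max.total.tasks", 100);    // concurrent write tasks per client
        conf.setInt("hbase.client.max.perserver.tasks", 2);  // concurrent write tasks per region server
        conf.setLong("hbase.client.write.buffer", 2097152);  // client write buffer size, in bytes
        return conf;
    }
}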

-Dima

On Mon, Jul 11, 2016 at 4:59 PM, Raja.Aravapalli <Raja.Aravapalli@target.com> wrote:

>
>
> Thanks for the response, Dima.
>
> Please find below some of the details:
>
>
> Hadoop Distribution: Hortonworks
> Version: HBase 1.1.2.2.3.4.0
> Cluster size: 40 nodes
>
> About Application:
>
> We have an Apache Apex application in which one of the programs runs as 3
> instances that do PUT operations simultaneously on the same HBase table.
>
>
> The application has been running fine for 3-4 days. Although it recovered
> automatically, I want to understand more about these exceptions in the log:
>
>
> Caused by: java.io.InterruptedIOException: #2, interrupted. currentNumberOfTask=1
>         at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1661)
>         at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1687)
>         at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:208)
>         at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
>         at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1449)
>         at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1040)
>
>
>
> Thanks a lot in advance.
>
>
> Regards,
> Raja.
>
>
>

Re: Caused by: java.io.InterruptedIOException

Posted by "Raja.Aravapalli" <Ra...@target.com>.

Thanks for the response, Dima.

Please find below some of the details:


Hadoop Distribution: Hortonworks
Version: HBase 1.1.2.2.3.4.0
Cluster size: 40 nodes

About Application:

We have an Apache Apex application in which one of the programs runs as 3 instances that do PUT operations simultaneously on the same HBase table.


The application has been running fine for 3-4 days. Although it recovered automatically, I want to understand more about these exceptions in the log:


Caused by: java.io.InterruptedIOException: #2, interrupted. currentNumberOfTask=1
        at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1661)
        at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1687)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:208)
        at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
        at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1449)
        at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1040)
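
Since the trace goes through BufferedMutatorImpl, here is a minimal sketch of what one writer instance's buffered write path looks like with the standard HBase 1.x API (the class, table, and column names are hypothetical; this is not the actual Apex operator code):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.BufferedMutator;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

public class BufferedWriterSketch {
    public static void main(String[] args) throws IOException {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             BufferedMutator mutator = conn.getBufferedMutator(TableName.valueOf("example_table"))) {
            for (int i = 0; i < 1000; i++) {
                Put put = new Put(Bytes.toBytes("row-" + i));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value-" + i));
                mutator.mutate(put); // buffered; written out when the buffer fills or on flush()/close()
            }
            // flush() blocks until AsyncProcess finishes the outstanding writes.
            // If the writer thread is interrupted during that wait, the client
            // throws java.io.InterruptedIOException, as in the trace above.
            mutator.flush();
        }
    }
}

In other words, the exception reports that the writer thread was interrupted while waiting for in-flight puts, so the interesting question is what interrupted it, for example an operator shutdown, redeploy, or timeout on the Apex side.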



Thanks a lot in advance.


Regards, 
Raja.



On 7/11/16, 4:10 PM, "Dima Spivak" <ds...@cloudera.com> wrote:

>Hey Raja,
>
>We'll need more details about your setup (HBase version, size/topology of
>cluster, server specs, etc.) and the applications you're running before we
>can even start giving ideas of things to try. Wanna pass those along?
>
>-Dima
>

Re: Caused by: java.io.InterruptedIOException

Posted by Dima Spivak <ds...@cloudera.com>.
Hey Raja,

We'll need more details about your setup (HBase version, size/topology of
cluster, server specs, etc.) and the applications you're running before we
can even start giving ideas of things to try. Wanna pass those along?

-Dima

On Monday, July 11, 2016, Raja.Aravapalli <Ra...@target.com> wrote:

>
> Hi,
>
>
> One of our applications, which does put operations on an HBase table, is
> failing with the exception below in the log. Can someone please help me
> debug and fix this issue? I am new to HBase.
>
>
> Caused by: java.io.InterruptedIOException: #2, interrupted. currentNumberOfTask=1
>         at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1661)
>         at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1687)
>         at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:208)
>         at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:183)
>         at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1449)
>         at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1040)
>
>
>
>
> Regards,
> Raja.
>