Posted to user@hbase.apache.org by Bryan Baugher <bj...@gmail.com> on 2013/02/20 23:05:54 UTC

Custom HBase Filter : Error in readFields

Hi everyone,

I am trying to write my own custom Filter, but I have been having issues.
When there is only one region in my table the scan works as expected, but
when there are more, HBase creates a new instance of my filter and
deserializes it again, and the data seems to be gone. I am running HBase
0.92.1-cdh4.1.1.

2013-02-20 15:39:53,220 DEBUG com.cerner.kepler.filters.RowRangeFilter: Reading fields
2013-02-20 15:40:08,612 WARN org.apache.hadoop.hbase.util.Sleeper: We slept 15346ms instead of 3000ms, this is likely due to a long garbage collecting pause and it's usually bad, see http://hbase.apache.org/book.html#trouble.rs.runtime.zkexpired
2013-02-20 15:40:09,142 ERROR org.apache.hadoop.hbase.io.HbaseObjectWritable: Error in readFields
java.lang.ArrayIndexOutOfBoundsException
        at java.lang.System.arraycopy(Native Method)
        at java.io.ByteArrayInputStream.read(ByteArrayInputStream.java:174)
        at java.io.DataInputStream.readFully(DataInputStream.java:178)
        at java.io.DataInputStream.readFully(DataInputStream.java:152)
        at com.cerner.kepler.filters.RowRangeFilter.readFields(RowRangeFilter.java:226)
        at org.apache.hadoop.hbase.client.Scan.readFields(Scan.java:548)
        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
        at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1254)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1183)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:719)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:511)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:486)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
2013-02-20 15:40:17,498 WARN org.apache.hadoop.ipc.HBaseServer: Unable to read call parameters for client ***
java.io.IOException: Error in readFields
        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:655)
        at org.apache.hadoop.hbase.ipc.Invocation.readFields(Invocation.java:125)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.processData(HBaseServer.java:1254)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Connection.readAndProcess(HBaseServer.java:1183)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener.doRead(HBaseServer.java:719)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.doRunLoop(HBaseServer.java:511)
        at org.apache.hadoop.hbase.ipc.HBaseServer$Listener$Reader.run(HBaseServer.java:486)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)
Caused by: java.lang.ArrayIndexOutOfBoundsException
        at java.lang.System.arraycopy(Native Method)
        at java.io.ByteArrayInputStream.read(ByteArrayInputStream.java:174)
        at java.io.DataInputStream.readFully(DataInputStream.java:178)
        at java.io.DataInputStream.readFully(DataInputStream.java:152)
        at com.cerner.kepler.filters.RowRangeFilter.readFields(RowRangeFilter.java:226)
        at org.apache.hadoop.hbase.client.Scan.readFields(Scan.java:548)
        at org.apache.hadoop.hbase.io.HbaseObjectWritable.readObject(HbaseObjectWritable.java:652)
        ... 9 more

-Bryan

Re: Custom HBase Filter : Error in readFields

Posted by Ted Yu <yu...@gmail.com>.
Bryan:
Looks like you may have missed adding a unit test for your filter.

A unit test would have caught this much earlier.

Cheers
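
Ted's point can be made concrete with a round-trip test: serialize the
filter with write(), deserialize into a fresh instance with readFields(),
and compare fields. The sketch below is hypothetical -- the real
RowRangeFilter code was not posted, so the startRow/stopRow fields are
assumptions -- and it uses plain java.io streams rather than the HBase
Writable interface so it runs standalone:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.util.Arrays;

// Hypothetical sketch of the Writable round trip a unit test would exercise.
// startRow/stopRow are assumed fields; the real RowRangeFilter may differ.
public class RowRangeRoundTrip {
    byte[] startRow;
    byte[] stopRow;

    RowRangeRoundTrip() {}

    RowRangeRoundTrip(byte[] startRow, byte[] stopRow) {
        this.startRow = startRow;
        this.stopRow = stopRow;
    }

    // Mirrors Writable.write: length-prefix each array so readFields
    // knows exactly how many bytes to readFully.
    void write(DataOutputStream out) throws IOException {
        out.writeInt(startRow.length);
        out.write(startRow);
        out.writeInt(stopRow.length);
        out.write(stopRow);
    }

    // Mirrors Writable.readFields: read back exactly what write emitted.
    void readFields(DataInputStream in) throws IOException {
        startRow = new byte[in.readInt()];
        in.readFully(startRow);
        stopRow = new byte[in.readInt()];
        in.readFully(stopRow);
    }

    public static void main(String[] args) throws IOException {
        RowRangeRoundTrip original =
            new RowRangeRoundTrip("row-aaa".getBytes(), "row-zzz".getBytes());

        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        original.write(new DataOutputStream(buf));

        RowRangeRoundTrip copy = new RowRangeRoundTrip();
        copy.readFields(
            new DataInputStream(new ByteArrayInputStream(buf.toByteArray())));

        // The round trip must reproduce the fields exactly; an asymmetry
        // here is what surfaces server-side as the exception in the logs.
        if (!Arrays.equals(original.startRow, copy.startRow)
                || !Arrays.equals(original.stopRow, copy.stopRow)) {
            throw new AssertionError("write/readFields are not symmetric");
        }
        System.out.println("round trip OK");
    }
}
```

A test like this in the filter's build would catch the asymmetry before a
scan ever reaches a region server.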

On Wed, Feb 20, 2013 at 3:42 PM, Viral Bajaria <vi...@gmail.com> wrote:

> Also the readFields is your implementation of how to read the byte array
> transferred from the client. So I think there has to be some issue in how
> you write the byte array to the network and what you are reading out of
> that i.e. the size of arrays might not be identical.
>
> But as Ted mentioned, looking at the code will help troubleshoot it better.
>
> On Wed, Feb 20, 2013 at 3:32 PM, Ted Yu <yu...@gmail.com> wrote:
>
> > If you show us the code for RowRangeFilter, that would help us
> > troubleshoot.
> >
> > Cheers
> >
> > On Wed, Feb 20, 2013 at 2:05 PM, Bryan Baugher <bj...@gmail.com> wrote:
> >
> > > [original message and stack trace snipped]

Re: Custom HBase Filter : Error in readFields

Posted by Viral Bajaria <vi...@gmail.com>.
Also, readFields is your implementation of how to read the byte array
transferred from the client, so I think there is some issue between how
you write the byte array to the network and what you read back out of it,
i.e. the array sizes might not be identical.

But as Ted mentioned, looking at the code will help troubleshoot it better.
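
The kind of desync Viral describes can be sketched as follows. This is a
hypothetical example (the real RowRangeFilter code was not posted): the
writer length-prefixes with a short while the reader expects an int, so
the reader consumes two bytes of row data as part of the length and
readFully runs off the end of the stream.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.EOFException;
import java.io.IOException;

// Hypothetical demonstration of a write/readFields asymmetry.
public class AsymmetricReadDemo {
    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);

        byte[] row = "abc".getBytes();
        out.writeShort(row.length);   // writer: 2-byte length prefix
        out.write(row);

        DataInputStream in =
            new DataInputStream(new ByteArrayInputStream(buf.toByteArray()));
        try {
            int len = in.readInt();   // reader: consumes 4 bytes -> garbage length
            byte[] decoded = new byte[len];
            in.readFully(decoded);    // claims far more bytes than remain
            System.out.println("read " + len + " bytes");
        } catch (EOFException e) {
            // Depending on stream type and offsets, this desync can also
            // surface as the ArrayIndexOutOfBoundsException seen in the logs.
            System.out.println("deserialization desync: " + e);
        }
    }
}
```

The exact exception depends on how far off the bogus length is, but the
root cause is the same: write and readFields must agree byte-for-byte on
the wire format.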

On Wed, Feb 20, 2013 at 3:32 PM, Ted Yu <yu...@gmail.com> wrote:

> If you show us the code for RowRangeFilter, that would help us
> troubleshoot.
>
> Cheers
>
> On Wed, Feb 20, 2013 at 2:05 PM, Bryan Baugher <bj...@gmail.com> wrote:
>
> > [original message and stack trace snipped]

Re: Custom HBase Filter : Error in readFields

Posted by Ted Yu <yu...@gmail.com>.
If you show us the code for RowRangeFilter, that would help us troubleshoot.

Cheers

On Wed, Feb 20, 2013 at 2:05 PM, Bryan Baugher <bj...@gmail.com> wrote:

> [original message and stack trace snipped]