Posted to dev@hbase.apache.org by "Clint Morgan (JIRA)" <ji...@apache.org> on 2008/04/01 01:42:24 UTC

[jira] Updated: (HBASE-554) filters generate StackOverflowException

     [ https://issues.apache.org/jira/browse/HBASE-554?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Clint Morgan updated HBASE-554:
-------------------------------

    Attachment: hbase-554.patch

The culprit is in the last patch I submitted: upon filtering an assembled row, it recursively calls next(). Technically that is a tail call, but the JVM does not perform tail-call elimination, so each filtered row adds a stack frame.

This patch uses explicit iteration instead. Let me know if it works for you...
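
In rough outline, the bug and the fix look like this (a self-contained Java illustration with made-up names, not the actual HBase code):

    // Illustration only -- not the actual HBase code. Shows why a
    // recursive skip-on-filter blows the stack and why a loop does not.
    abstract class FilteringScanner {
        abstract boolean fetchNextRow();   // advance to the next raw row
        abstract boolean rowIsFiltered();  // true if the filter rejects it

        // Recursive form: one stack frame per filtered row. A long run of
        // non-matching rows ends in StackOverflowError, because the JVM
        // does not eliminate tail calls.
        boolean nextRecursive() {
            if (!fetchNextRow()) {
                return false;              // no more rows
            }
            return rowIsFiltered() ? nextRecursive() : true;
        }

        // Iterative form (the approach this patch takes): loop until a
        // row passes the filter or the rows run out. Stack depth stays
        // constant no matter how many rows are skipped.
        boolean nextIterative() {
            while (fetchNextRow()) {
                if (!rowIsFiltered()) {
                    return true;           // found a matching row
                }
            }
            return false;                  // scanner exhausted
        }
    }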

> filters generate StackOverflowException
> ---------------------------------------
>
>                 Key: HBASE-554
>                 URL: https://issues.apache.org/jira/browse/HBASE-554
>             Project: Hadoop HBase
>          Issue Type: Bug
>          Components: filters
>    Affects Versions: 0.16.0, 0.2.0, 0.1.0
>            Reporter: stack
>         Attachments: hbase-554.patch
>
>
> Below is from the mailing list.
> You're doing nothing wrong.
> The filters as written recurse until they find a match. If there are long stretches between matching rows, you will get a StackOverflowError. The filters need to be changed. Thanks for pointing this out. Can you do without them for the moment, until we get a chance to fix them?
> St.Ack
> David Alves wrote:
> > Hi St.Ack and all
> > 	
> > 	The error always occurs when trying to see if there are more rows to
> > process.
> > 	Yes, I'm using a filter (RegExpRowFilter) to select only the rows (any
> > row key) that match a specific value in one of the columns.
> > 	Then I obtain the scanner, just test the hasNext method, close the
> > scanner, and return. (A sketch of this usage pattern follows the stack
> > trace below.)
> > 	Am I doing something wrong?
> > 	Still, a StackOverflowError is not supposed to happen, right?
> >
> > Regards
> > David Alves
> > On Thu, 2008-03-27 at 12:36 -0700, stack wrote:
> >> You are using a filter?  If so, tell us more about it.
> >> St.Ack
> >>
> >> David Alves wrote:
> >>> Hi guys 
> >>>
> >>> 	I'm using HBase to keep data that is later indexed.
> >>> 	The data is indexed in chunks, so the cycle is: get XXXX records,
> >>> index them, check for more records, and so on.
> >>> 	When I tried candidate-2 instead of the old 0.16.0 (which I had
> >>> switched to due to the regionservers becoming unresponsive), I got the
> >>> error at the end of this email, well into an indexing job.
> >>> 	Do you have any idea why? Am I doing something wrong?
> >>>
> >>> David Alves
> >>>
> >>> java.lang.RuntimeException: org.apache.hadoop.ipc.RemoteException:
> >>> java.io.IOException: java.lang.StackOverflowError
> >>>         at java.io.DataInputStream.readFully(DataInputStream.java:178)
> >>>         at java.io.DataInputStream.readLong(DataInputStream.java:399)
> >>>         at org.apache.hadoop.dfs.DFSClient$BlockReader.readChunk(DFSClient.java:735)
> >>>         at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:234)
> >>>         at org.apache.hadoop.fs.FSInputChecker.fill(FSInputChecker.java:176)
> >>>         at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:193)
> >>>         at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:157)
> >>>         at org.apache.hadoop.dfs.DFSClient$BlockReader.read(DFSClient.java:658)
> >>>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.readBuffer(DFSClient.java:1130)
> >>>         at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1166)
> >>>         at java.io.DataInputStream.readFully(DataInputStream.java:178)
> >>>         at org.apache.hadoop.io.DataOutputBuffer$Buffer.write(DataOutputBuffer.java:56)
> >>>         at org.apache.hadoop.io.DataOutputBuffer.write(DataOutputBuffer.java:90)
> >>>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1829)
> >>>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1729)
> >>>         at org.apache.hadoop.io.SequenceFile$Reader.next(SequenceFile.java:1775)
> >>>         at org.apache.hadoop.io.MapFile$Reader.next(MapFile.java:461)
> >>>         at org.apache.hadoop.hbase.HStore$StoreFileScanner.getNext(HStore.java:2350)
> >>>         at org.apache.hadoop.hbase.HAbstractScanner.next(HAbstractScanner.java:256)
> >>>         at org.apache.hadoop.hbase.HStore$HStoreScanner.next(HStore.java:2561)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1807)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>>         at org.apache.hadoop.hbase.HRegion$HScanner.next(HRegion.java:1843)
> >>> ...
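
For reference, the client-side usage David describes above looks roughly like the sketch below. It is written from memory against the 0.1-era API (HTable, obtainScanner, RegExpRowFilter with a column-value map, HScannerInterface), so treat the exact constructors and signatures as assumptions rather than a verified example:

    // Hedged sketch of the reported usage, written from memory of the
    // 0.1-era client API; exact signatures may differ.
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;
    import java.util.SortedMap;
    import java.util.TreeMap;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.HScannerInterface;
    import org.apache.hadoop.hbase.HStoreKey;
    import org.apache.hadoop.hbase.HTable;
    import org.apache.hadoop.hbase.filter.RegExpRowFilter;
    import org.apache.hadoop.io.Text;

    public class ScanProbe {
        public static void main(String[] args) throws IOException {
            HTable table = new HTable(new HBaseConfiguration(), new Text("mytable"));

            // Match any row key, but require a specific value in one column
            // ("family:qualifier" and "wanted-value" are placeholders).
            Map<Text, byte[]> columnFilter = new HashMap<Text, byte[]>();
            columnFilter.put(new Text("family:qualifier"), "wanted-value".getBytes());
            RegExpRowFilter filter = new RegExpRowFilter(".*", columnFilter);

            HScannerInterface scanner = table.obtainScanner(
                new Text[] { new Text("family:qualifier") }, new Text(""), filter);
            try {
                // Equivalent of the hasNext probe: ask for one row. With the
                // pre-patch recursive next(), a long run of rows rejected by
                // the filter overflows the stack right here.
                HStoreKey key = new HStoreKey();
                SortedMap<Text, byte[]> row = new TreeMap<Text, byte[]>();
                boolean hasMore = scanner.next(key, row);
                System.out.println("rows matching filter: " + hasMore);
            } finally {
                scanner.close();
            }
        }
    }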

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.