Posted to issues@hbase.apache.org by "Zhihong Yu (Created) (JIRA)" <ji...@apache.org> on 2012/02/19 20:50:34 UTC

[jira] [Created] (HBASE-5435) TestForceCacheImportantBlocks fails with OutOfMemoryError

TestForceCacheImportantBlocks fails with OutOfMemoryError
---------------------------------------------------------

                 Key: HBASE-5435
                 URL: https://issues.apache.org/jira/browse/HBASE-5435
             Project: HBase
          Issue Type: Test
            Reporter: Zhihong Yu
             Fix For: 0.94.0


Here is the related stack trace (see https://builds.apache.org/job/HBase-TRUNK/2665/testReport/org.apache.hadoop.hbase.io.hfile/TestForceCacheImportantBlocks/testCacheBlocks_1_/):
{code}
Caused by: java.lang.OutOfMemoryError
	at java.util.zip.Deflater.init(Native Method)
	at java.util.zip.Deflater.<init>(Deflater.java:124)
	at java.util.zip.GZIPOutputStream.<init>(GZIPOutputStream.java:46)
	at java.util.zip.GZIPOutputStream.<init>(GZIPOutputStream.java:58)
	at org.apache.hadoop.hbase.io.hfile.ReusableStreamGzipCodec$ReusableGzipOutputStream$ResetableGZIPOutputStream.<init>(ReusableStreamGzipCodec.java:79)
	at org.apache.hadoop.hbase.io.hfile.ReusableStreamGzipCodec$ReusableGzipOutputStream.<init>(ReusableStreamGzipCodec.java:90)
	at org.apache.hadoop.hbase.io.hfile.ReusableStreamGzipCodec.createOutputStream(ReusableStreamGzipCodec.java:130)
	at org.apache.hadoop.io.compress.GzipCodec.createOutputStream(GzipCodec.java:101)
	at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.createPlainCompressionStream(Compression.java:239)
	at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.createCompressionStream(Compression.java:223)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV1.getCompressingStream(HFileWriterV1.java:270)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV1.close(HFileWriterV1.java:416)
	at org.apache.hadoop.hbase.regionserver.StoreFile$Writer.close(StoreFile.java:1115)
	at org.apache.hadoop.hbase.regionserver.Store.internalFlushCache(Store.java:706)
	at org.apache.hadoop.hbase.regionserver.Store.flushCache(Store.java:633)
	at org.apache.hadoop.hbase.regionserver.Store.access$400(Store.java:106)
{code}
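For context on where this lands: java.util.zip.Deflater allocates its zlib state in native memory, outside the Java heap, so Deflater.init can throw OutOfMemoryError even when plenty of heap is free. Below is a minimal Java sketch (illustration only, not HBase code; the class name is made up) of the reuse-and-release pattern that avoids allocating fresh native state per stream, which is the idea behind ReusableStreamGzipCodec's resettable stream:
{code}
import java.util.zip.Deflater;

public class DeflaterReuseSketch {
    public static void main(String[] args) {
        // One native zlib allocation, reused for every block below.
        Deflater deflater = new Deflater(Deflater.BEST_SPEED, true);
        try {
            byte[] out = new byte[4096];
            for (int i = 0; i < 3; i++) {
                deflater.reset();          // reuse native state instead of reallocating
                deflater.setInput(("block-" + i).getBytes());
                deflater.finish();
                while (!deflater.finished()) {
                    int n = deflater.deflate(out);
                    System.out.println("compressed " + n + " bytes");
                }
            }
        } finally {
            deflater.end();                // release native zlib memory promptly
        }
    }
}
{code}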

[jira] [Commented] (HBASE-5435) TestForceCacheImportantBlocks fails with OutOfMemoryError

Posted by "Zhihong Yu (Commented) (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-5435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13234615#comment-13234615 ] 

Zhihong Yu commented on HBASE-5435:
-----------------------------------

This failure isn't easily reproducible.
We can move it to 0.96.
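If a deterministic reproduction helps, one hypothetical way to trigger this class of OutOfMemoryError on demand (a stress sketch, not taken from the test) is to leak Deflater instances without calling end(); the strong references keep finalizers from reclaiming native zlib state, so Deflater.init eventually fails exactly as in the trace above:
{code}
import java.util.ArrayList;
import java.util.List;
import java.util.zip.Deflater;

public class DeflaterLeakRepro {
    public static void main(String[] args) {
        List<Deflater> leaked = new ArrayList<Deflater>();
        while (true) {
            // Each Deflater grabs native zlib buffers; holding a reference
            // blocks finalization, so native memory climbs until
            // Deflater.init throws OutOfMemoryError.
            leaked.add(new Deflater());
        }
    }
}
{code}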
                

[jira] [Updated] (HBASE-5435) TestForceCacheImportantBlocks fails with OutOfMemoryError

Posted by "Lars Hofhansl (Updated) (JIRA)" <ji...@apache.org>.
     [ https://issues.apache.org/jira/browse/HBASE-5435?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Lars Hofhansl updated HBASE-5435:
---------------------------------

    Fix Version/s:     (was: 0.94.0)
                   0.96.0
    

[jira] [Commented] (HBASE-5435) TestForceCacheImportantBlocks fails with OutOfMemoryError

Posted by "Lars Hofhansl (Commented) (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-5435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13234599#comment-13234599 ] 

Lars Hofhansl commented on HBASE-5435:
--------------------------------------

Any idea for a fix, Ted? If not, I'll push to 0.96.
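One direction worth considering (sketched here as an assumption, not the agreed fix): borrow and return compressors through Hadoop's CodecPool so native Deflater state is recycled rather than allocated per stream. The class name below is hypothetical; CodecPool, GzipCodec, and ReflectionUtils are real Hadoop APIs:
{code}
import java.io.ByteArrayOutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.compress.CodecPool;
import org.apache.hadoop.io.compress.CompressionOutputStream;
import org.apache.hadoop.io.compress.Compressor;
import org.apache.hadoop.io.compress.GzipCodec;
import org.apache.hadoop.util.ReflectionUtils;

public class PooledCompressionSketch {
    public static void main(String[] args) throws Exception {
        GzipCodec codec = ReflectionUtils.newInstance(GzipCodec.class, new Configuration());
        // May hand back a pooled compressor instead of allocating new native state.
        Compressor compressor = CodecPool.getCompressor(codec);
        try {
            ByteArrayOutputStream sink = new ByteArrayOutputStream();
            CompressionOutputStream out = codec.createOutputStream(sink, compressor);
            out.write("hello hfile".getBytes());
            out.close();
            System.out.println("compressed to " + sink.size() + " bytes");
        } finally {
            CodecPool.returnCompressor(compressor); // recycle instead of leaking
        }
    }
}
{code}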
                

[jira] [Commented] (HBASE-5435) TestForceCacheImportantBlocks fails with OutOfMemoryError

Posted by "Zhihong Yu (Commented) (JIRA)" <ji...@apache.org>.
    [ https://issues.apache.org/jira/browse/HBASE-5435?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13244276#comment-13244276 ] 

Zhihong Yu commented on HBASE-5435:
-----------------------------------

The test error happened in 0.94 build #75 as well:
https://builds.apache.org/job/HBase-0.94/75/testReport/junit/org.apache.hadoop.hbase.io.hfile/TestForceCacheImportantBlocks/testCacheBlocks_1_/
                