Posted to user@hbase.apache.org by Peter Naudus <pn...@dataraker.com> on 2012/02/28 22:52:00 UTC

java.io.IOException: Compression algorithm 'snappy' previously failed test

Hello All,

I am using HBase 0.92.0 and Hadoop 0.23.0. When I attempt to create a table
with snappy compression, its region gets stuck in the PENDING_OPEN state,
with the following message repeated in the region server's log file:

2012-02-28 21:08:18,010 INFO  
org.apache.hadoop.hbase.regionserver.HRegionServer: Received request to  
open region: staging_dev,,1330462814322.1300b0adf5bfad3a8aa4d88326802171.
2012-02-28 21:08:18,010 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign:  
regionserver:60020-0x35c5c0fb080003 Attempting to transition node  
1300b0adf5bfad3a8aa4d88326802171 from M_ZK_REGION_OFFLINE to  
RS_ZK_REGION_OPENING
2012-02-28 21:08:18,017 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign:  
regionserver:60020-0x35c5c0fb080003 Successfully transitioned node  
1300b0adf5bfad3a8aa4d88326802171 from M_ZK_REGION_OFFLINE to  
RS_ZK_REGION_OPENING
2012-02-28 21:08:18,018 DEBUG  
org.apache.hadoop.hbase.regionserver.HRegion: Opening region: {NAME =>  
'staging_dev,,1330462814322.1300b0adf5bfad3a8aa4d88326802171.', STARTKEY  
=> '', ENDKEY => '', ENCODED => 1300b0adf5bfad3a8aa4d88326802171,}
2012-02-28 21:08:18,018 INFO org.apache.hadoop.hbase.regionserver.HRegion:  
Setting up tabledescriptor config now ...
2012-02-28 21:08:18,018 DEBUG  
org.apache.hadoop.hbase.regionserver.HRegion: Instantiated  
staging_dev,,1330462814322.1300b0adf5bfad3a8aa4d88326802171.
2012-02-28 21:08:18,018 ERROR  
org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Failed  
open of region=staging_dev,,1330462814322.1300b0adf5bfad3a8aa4d88326802171.
java.io.IOException: Compression algorithm 'snappy' previously failed test.
         at  
org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:78)
         at  
org.apache.hadoop.hbase.regionserver.HRegion.checkCompressionCodecs(HRegion.java:3234)
         at  
org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:3223)
         at  
org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:3173)
         at  
org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.openRegion(OpenRegionHandler.java:331)
         at  
org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler.process(OpenRegionHandler.java:107)
         at  
org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:169)
         at  
java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
         at  
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
         at java.lang.Thread.run(Thread.java:662)
2012-02-28 21:08:18,018 INFO  
org.apache.hadoop.hbase.regionserver.handler.OpenRegionHandler: Opening of  
region {NAME =>  
'staging_dev,,1330462814322.1300b0adf5bfad3a8aa4d88326802171.', STARTKEY  
=> '', ENDKEY => '', ENCODED => 1300b0adf5bfad3a8aa4d88326802171,} failed,  
marking as FAILED_OPEN in ZK
2012-02-28 21:08:18,018 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign:  
regionserver:60020-0x35c5c0fb080003 Attempting to transition node  
1300b0adf5bfad3a8aa4d88326802171 from RS_ZK_REGION_OPENING to  
RS_ZK_REGION_FAILED_OPEN
2012-02-28 21:08:18,054 DEBUG org.apache.hadoop.hbase.zookeeper.ZKAssign:  
regionserver:60020-0x35c5c0fb080003 Successfully transitioned node  
1300b0adf5bfad3a8aa4d88326802171 from RS_ZK_REGION_OPENING to  
RS_ZK_REGION_FAILED_OPEN
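
For reference, I created the table along these lines (the column family name
'cf1' below is illustrative, not the real schema):

bash-3.2$ # create a snappy-compressed table via the hbase shell
bash-3.2$ echo "create 'staging_dev', {NAME => 'cf1', COMPRESSION => 'SNAPPY'}" | ./hbase shell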

I tried setting HBASE_LIBRARY_PATH (and HBASE_CLASSPATH, just in case) to
point to the directory containing libsnappy, but I am still getting this
error.

What else can I do to fix / diagnose this problem?

Thanks!

~ Peter

Re: java.io.IOException: Compression algorithm 'snappy' previously failed test

Posted by Peter Naudus <pn...@dataraker.com>.
Hi Nathaniel,

My memory's a little hazy (we ended up going back to CDH3), but I believe
the key to fixing this problem for me was the following log message:

> WARN util.NativeCodeLoader: Unable to load native-hadoop library for
> your platform... using builtin-java classes where applicable

I had set HBASE_LIBRARY_PATH to Hadoop's library path, since that was where
snappy was. Once I set the path (or created symbolic links) so that
HBASE_LIBRARY_PATH included both HBase's and Hadoop's native libraries, the
snappy issue was resolved.
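
In other words, something along these lines (the paths below are
illustrative; substitute your actual HBase and Hadoop install locations):

bash-3.2$ # put both HBase's and Hadoop's native lib dirs on the path (illustrative paths)
bash-3.2$ export HBASE_LIBRARY_PATH=/usr/lib/hbase/lib/native/Linux-amd64-64:/usr/lib/hadoop/lib/native
bash-3.2$ # ...or instead symlink Hadoop's native libs into HBase's native dir
bash-3.2$ ln -s /usr/lib/hadoop/lib/native/libsnappy.so* /usr/lib/hbase/lib/native/Linux-amd64-64/
bash-3.2$ ln -s /usr/lib/hadoop/lib/native/libhadoop.so* /usr/lib/hbase/lib/native/Linux-amd64-64/

Once both sets of native libraries were loadable, the snappy compression
test passed and the regions opened.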

Also, in general, I ended up having much better luck with the RPMs than
with the tarballs.

~ Peter

On Mon, 23 Apr 2012 11:01:19 -0400, Nathaniel Cook <nv...@gmail.com>  
wrote:

> Was there any resolution to this? I am experiencing the same issue.
>
> Nathaniel
>
> On Wed, Feb 29, 2012 at 10:52 AM, Peter Naudus <pn...@dataraker.com>  
> wrote:
>> Thanks for your help :)
>>
>> To make sure, I manually set LD_LIBRARY_PATH, LIBRARY_PATH, and
>> HBASE_LIBRARY_PATH
>>
>> bash-3.2$ export
>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
>> bash-3.2$ export
>> LIBRARY_PATH=$LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
>> bash-3.2$ export
>> HBASE_LIBRARY_PATH=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
>>
>> But running the compression test failed with "native snappy library not
>> available"
>>
>> bash-3.2$ ./hbase org.apache.hadoop.hbase.util.CompressionTest
>> file:///tmp/test.txt snappy
>> log4j:WARN No appenders could be found for logger
>> (org.apache.hadoop.conf.Configuration).
>> log4j:WARN Please initialize the log4j system properly.
>> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
>> more info.
>> Exception in thread "main" java.lang.RuntimeException: native snappy  
>> library
>> not available
>>        at
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:121)
>>        at
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:104)
>>        at
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:118)
>>        at
>> org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:236)
>>        at
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:588)
>>        at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:178)
>>        at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:150)
>>        at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:140)
>>        at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:104)
>>        at
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
>>        at
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:137)
>>
>> I verified that libsnappy is indeed installed
>>
>> bash-3.2$ ls -al $HBASE_LIBRARY_PATH
>> total 1412
>> drwxr-xr-x 2 1106 592   4096 Feb 11 01:06 .
>> drwxr-xr-x 3 1106 592   4096 Feb 11 01:06 ..
>> -rw-r--r-- 1 1106 592 616862 Feb 11 01:06 libhadoop.a
>> -rwxr-xr-x 1 1106 592   1051 Feb 11 01:06 libhadoop.la
>> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so ->  
>> libhadoop.so.1.0.0
>> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so.1 ->
>> libhadoop.so.1.0.0
>> -rwxr-xr-x 1 1106 592 340361 Feb 11 01:06 libhadoop.so.1.0.0
>> -rw-r--r-- 1 1106 592 184418 Feb 11 01:06 libhdfs.a
>> -rwxr-xr-x 1 1106 592   1034 Feb 11 01:06 libhdfs.la
>> lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so -> libhdfs.so.0.0.0
>> lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so.0 ->  
>> libhdfs.so.0.0.0
>> -rwxr-xr-x 1 1106 592 125455 Feb 11 01:06 libhdfs.so.0.0.0
>> -rw-r--r-- 1 1106 592  37392 Feb 11 01:06 libsnappy.a
>> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so ->  
>> libsnappy.so.1.1.1
>> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so.1 ->
>> libsnappy.so.1.1.1
>> -rw-r--r-- 1 1106 592  26824 Feb 11 01:06 libsnappy.so.1.1.1
>>
>> Just for grins and giggles I re-ran this as root
>>
>> In addition to the Exception mentioned above, I also got the following
>> warning:
>>        WARN util.NativeCodeLoader: Unable to load native-hadoop library  
>> for
>> your platform... using builtin-java classes where applicable
>>
>> Any ideas?
>>
>>
>> On Tue, 28 Feb 2012 20:02:38 -0500, Stack <st...@duboce.net> wrote:
>>
>>> On Tue, Feb 28, 2012 at 1:52 PM, Peter Naudus <pn...@dataraker.com>
>>> wrote:
>>>>
>>>> What else can I do to fix / diagnose this problem?
>>>>
>>>
>>> Does our little compression tool help?
>>> http://hbase.apache.org/book.html#compression.test
>>>
>>> St.Ack
>>
>>
>>

Re: java.io.IOException: Compression algorithm 'snappy' previously failed test

Posted by Nathaniel Cook <nv...@gmail.com>.
Was there any resolution to this? I am experiencing the same issue.

Nathaniel

On Wed, Feb 29, 2012 at 10:52 AM, Peter Naudus <pn...@dataraker.com> wrote:
> Thanks for your help :)
>
> To make sure, I manually set LD_LIBRARY_PATH, LIBRARY_PATH, and
> HBASE_LIBRARY_PATH
>
> bash-3.2$ export
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
> bash-3.2$ export
> LIBRARY_PATH=$LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
> bash-3.2$ export
> HBASE_LIBRARY_PATH=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
>
> But running the compression test failed with "native snappy library not
> available"
>
> bash-3.2$ ./hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/test.txt snappy
> log4j:WARN No appenders could be found for logger
> (org.apache.hadoop.conf.Configuration).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
> Exception in thread "main" java.lang.RuntimeException: native snappy library
> not available
>        at
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:121)
>        at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:104)
>        at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:118)
>        at
> org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:236)
>        at
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:588)
>        at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:178)
>        at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:150)
>        at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:140)
>        at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:104)
>        at
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
>        at
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:137)
>
> I verified that libsnappy is indeed installed
>
> bash-3.2$ ls -al $HBASE_LIBRARY_PATH
> total 1412
> drwxr-xr-x 2 1106 592   4096 Feb 11 01:06 .
> drwxr-xr-x 3 1106 592   4096 Feb 11 01:06 ..
> -rw-r--r-- 1 1106 592 616862 Feb 11 01:06 libhadoop.a
> -rwxr-xr-x 1 1106 592   1051 Feb 11 01:06 libhadoop.la
> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so -> libhadoop.so.1.0.0
> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so.1 ->
> libhadoop.so.1.0.0
> -rwxr-xr-x 1 1106 592 340361 Feb 11 01:06 libhadoop.so.1.0.0
> -rw-r--r-- 1 1106 592 184418 Feb 11 01:06 libhdfs.a
> -rwxr-xr-x 1 1106 592   1034 Feb 11 01:06 libhdfs.la
> lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so -> libhdfs.so.0.0.0
> lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so.0 -> libhdfs.so.0.0.0
> -rwxr-xr-x 1 1106 592 125455 Feb 11 01:06 libhdfs.so.0.0.0
> -rw-r--r-- 1 1106 592  37392 Feb 11 01:06 libsnappy.a
> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so -> libsnappy.so.1.1.1
> lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so.1 ->
> libsnappy.so.1.1.1
> -rw-r--r-- 1 1106 592  26824 Feb 11 01:06 libsnappy.so.1.1.1
>
> Just for grins and giggles I re-ran this as root
>
> In addition to the Exception mentioned above, I also got the following warning:
>        WARN util.NativeCodeLoader: Unable to load native-hadoop library for
> your platform... using builtin-java classes where applicable
>
> Any ideas?
>
>
> On Tue, 28 Feb 2012 20:02:38 -0500, Stack <st...@duboce.net> wrote:
>
>> On Tue, Feb 28, 2012 at 1:52 PM, Peter Naudus <pn...@dataraker.com>
>> wrote:
>>>
>>> What else can I do to fix / diagnose this problem?
>>>
>>
>> Does our little compression tool help?
>> http://hbase.apache.org/book.html#compression.test
>>
>> St.Ack
>
>
>

Re: java.io.IOException: Compression algorithm 'snappy' previously failed test

Posted by Peter Naudus <pn...@dataraker.com>.
Thanks for your help :)

To make sure, I manually set LD_LIBRARY_PATH, LIBRARY_PATH, and
HBASE_LIBRARY_PATH

bash-3.2$ export  
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
bash-3.2$ export  
LIBRARY_PATH=$LIBRARY_PATH:/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native
bash-3.2$ export  
HBASE_LIBRARY_PATH=/opt/dataraker/software/cdh4/hadoop-0.23.0-cdh4b1/lib/native

But running the compression test failed with "native snappy library not  
available"

bash-3.2$ ./hbase org.apache.hadoop.hbase.util.CompressionTest  
file:///tmp/test.txt snappy
log4j:WARN No appenders could be found for logger  
(org.apache.hadoop.conf.Configuration).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for  
more info.
Exception in thread "main" java.lang.RuntimeException: native snappy  
library not available
	at  
org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:121)
	at  
org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:104)
	at  
org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:118)
	at  
org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:236)
	at  
org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:588)
	at  
org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:178)
	at  
org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:150)
	at  
org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:140)
	at  
org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:104)
	at  
org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:108)
	at  
org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:137)

I verified that libsnappy is indeed installed

bash-3.2$ ls -al $HBASE_LIBRARY_PATH
total 1412
drwxr-xr-x 2 1106 592   4096 Feb 11 01:06 .
drwxr-xr-x 3 1106 592   4096 Feb 11 01:06 ..
-rw-r--r-- 1 1106 592 616862 Feb 11 01:06 libhadoop.a
-rwxr-xr-x 1 1106 592   1051 Feb 11 01:06 libhadoop.la
lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so ->  
libhadoop.so.1.0.0
lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libhadoop.so.1 ->  
libhadoop.so.1.0.0
-rwxr-xr-x 1 1106 592 340361 Feb 11 01:06 libhadoop.so.1.0.0
-rw-r--r-- 1 1106 592 184418 Feb 11 01:06 libhdfs.a
-rwxr-xr-x 1 1106 592   1034 Feb 11 01:06 libhdfs.la
lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so -> libhdfs.so.0.0.0
lrwxrwxrwx 1 1106 592     16 Feb 27 18:12 libhdfs.so.0 -> libhdfs.so.0.0.0
-rwxr-xr-x 1 1106 592 125455 Feb 11 01:06 libhdfs.so.0.0.0
-rw-r--r-- 1 1106 592  37392 Feb 11 01:06 libsnappy.a
lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so ->  
libsnappy.so.1.1.1
lrwxrwxrwx 1 1106 592     18 Feb 27 18:12 libsnappy.so.1 ->  
libsnappy.so.1.1.1
-rw-r--r-- 1 1106 592  26824 Feb 11 01:06 libsnappy.so.1.1.1

Just for grins and giggles I re-ran this as root

In addition to the Exception mentioned above, I also got the following warning:
	WARN util.NativeCodeLoader: Unable to load native-hadoop library for your  
platform... using builtin-java classes where applicable

Any ideas?

On Tue, 28 Feb 2012 20:02:38 -0500, Stack <st...@duboce.net> wrote:

> On Tue, Feb 28, 2012 at 1:52 PM, Peter Naudus <pn...@dataraker.com>  
> wrote:
>> What else can I do to fix / diagnose this problem?
>>
>
> Does our little compression tool help?
> http://hbase.apache.org/book.html#compression.test
>
> St.Ack



Re: java.io.IOException: Compression algorithm 'snappy' previously failed test

Posted by Stack <st...@duboce.net>.
On Tue, Feb 28, 2012 at 1:52 PM, Peter Naudus <pn...@dataraker.com> wrote:
> What else can I do to fix / diagnose this problem?
>

Does our little compression tool help?
http://hbase.apache.org/book.html#compression.test
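
Usage is along the lines of:

  $ ./bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/test.txt snappy

It writes a small test file at the given path using the named codec, and
fails with an error if the native library can't be loaded.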

St.Ack