Posted to user@hbase.apache.org by Hari Sreekumar <hs...@clickable.com> on 2011/02/19 21:04:35 UTC

Problem running LZO compression test

Hi,

I have installed LZO and copied the hadoop-gpl-compression.jar file to
$HADOOP_HOME/lib and $HBASE_HOME/lib. I have also copied
libgplcompression.la to $HADOOP_HOME/lib/native/Linux-amd../ and
to $HBASE_HOME/lib/native/Linux-amd../
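
For reference, the copy commands I used were roughly the following (the
platform subdirectory under lib/native is whatever matches your JVM; on
64-bit Linux it is usually Linux-amd64-64):

cp hadoop-gpl-compression.jar $HADOOP_HOME/lib/
cp hadoop-gpl-compression.jar $HBASE_HOME/lib/
cp libgplcompression.la $HADOOP_HOME/lib/native/Linux-amd64-64/
cp libgplcompression.la $HBASE_HOME/lib/native/Linux-amd64-64/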

I am now able to create new column families with LZO compression, but when I
run the CompressionTest class, I get this error:

> $HBASE_HOME/bin/hbase org.apache.hadoop.hbase.util.CompressionTest hdfs://hadoop1:54310 lzo

11/02/20 01:32:15 ERROR lzo.GPLNativeCodeLoader: Could not load native gpl library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
        at java.lang.Runtime.loadLibrary0(Runtime.java:823)
        at java.lang.System.loadLibrary(System.java:1028)
        at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:31)
        at com.hadoop.compression.lzo.LzoCodec.isNativeLzoLoaded(LzoCodec.java:69)
        at com.hadoop.compression.lzo.LzoCodec.getCompressorType(LzoCodec.java:146)
        at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:98)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:199)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.getCompressingStream(HFile.java:387)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.newBlock(HFile.java:373)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.checkBlockBoundary(HFile.java:344)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.append(HFile.java:515)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.append(HFile.java:495)
        at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:69)
11/02/20 01:32:15 ERROR lzo.LzoCodec: Cannot load native-lzo without native-hadoop
java.lang.RuntimeException: native-lzo library not available
        at com.hadoop.compression.lzo.LzoCodec.getCompressorType(LzoCodec.java:147)
        at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:98)
        at org.apache.hadoop.hbase.io.hfile.Compression$Algorithm.getCompressor(Compression.java:199)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.getCompressingStream(HFile.java:387)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.newBlock(HFile.java:373)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.checkBlockBoundary(HFile.java:344)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.append(HFile.java:515)
        at org.apache.hadoop.hbase.io.hfile.HFile$Writer.append(HFile.java:495)
        at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:69)

What could be the reason for this and how do I fix it?

Thanks,
Hari

Re: Problem running LZO compression test

Posted by Hari Sreekumar <hs...@clickable.com>.
Solved it.
I was copying only one native library file (the .la file); you need to copy
all of the files (the .so files too). The correct command is:

tar -cBf - -C build/hadoop-gpl-compression-0.1.0-dev/lib/native . | tar -xBvf - -C /path/to/hadoop/dist/lib/native

as mentioned in http://code.google.com/p/hadoop-gpl-compression/wiki/FAQ
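
For the HBase side, the same tar pipe presumably needs to be repeated with
$HBASE_HOME/lib/native as the target. After that, the native directory should
hold the full set of libtool outputs, not just the .la file (exact file names
may vary by build):

tar -cBf - -C build/hadoop-gpl-compression-0.1.0-dev/lib/native . | tar -xBvf - -C $HBASE_HOME/lib/native
ls $HBASE_HOME/lib/native/Linux-amd64-64/
# expect something like: libgplcompression.a  libgplcompression.la
#   libgplcompression.so  libgplcompression.so.0  libgplcompression.so.0.0.0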

Hari
