Posted to user@hadoop.apache.org by Amit Sela <am...@infolinks.com> on 2014/01/01 17:05:09 UTC

Setting up Snappy compression in Hadoop

Hi all,

I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
compression.
I'm adding the configurations:

configuration.setBoolean("mapred.compress.map.output", true);
configuration.set("mapred.map.output.compression.codec",
"org.apache.hadoop.io.compress.SnappyCodec");
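
(For reference, the same two settings can also be applied cluster-wide in mapred-site.xml instead of per-job in code — a sketch using the same Hadoop 1.x property names:)

```xml
<!-- mapred-site.xml: compress intermediate map output with Snappy -->
<property>
  <name>mapred.compress.map.output</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compression.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```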

And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/

Still, all map tasks fail with "native snappy library not available".

Could anyone elaborate on how to install Snappy for Hadoop?

Thanks,

Amit.

RE: Setting up Snappy compression in Hadoop

Posted by German Florez-Larrahondo <ge...@samsung.com>.
Amit

You may also need to check whether the native library you are using includes
Snappy or not. 

 

For example, when you compile from source and libsnappy.so is not found,
Snappy support is not included in the native library for Hadoop (to force
the build to fail when libsnappy is not found, the require.snappy flag is
required).

 

A quick test could be this:

 

htf@german:~/hadoop/lib/native$ nm libhadoop.so | grep -i snappy

0000000000005e10 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_compressBytesDirect
0000000000005db0 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_getLibraryName
0000000000006200 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
0000000000006450 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_decompressBytesDirect
0000000000006860 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
000000000000d030 T Java_org_apache_hadoop_util_NativeCodeLoader_buildSupportsSnappy
0000000000215958 b SnappyCompressor_clazz
0000000000215970 b SnappyCompressor_compressedDirectBuf
0000000000215978 b SnappyCompressor_directBufferSize
0000000000215960 b SnappyCompressor_uncompressedDirectBuf
0000000000215968 b SnappyCompressor_uncompressedDirectBufLen
0000000000215988 b SnappyDecompressor_clazz
0000000000215990 b SnappyDecompressor_compressedDirectBuf
0000000000215998 b SnappyDecompressor_compressedDirectBufLen
00000000002159a8 b SnappyDecompressor_directBufferSize
00000000002159a0 b SnappyDecompressor_uncompressedDirectBuf
0000000000215980 b dlsym_snappy_compress
00000000002159b0 b dlsym_snappy_uncompress

 

If you don't see any snappy-related objects in the library, then it hasn't
been compiled with Snappy support. 

 

Note that this info is based on recent Hadoop releases (like 2.2.0), but
something similar should apply to your release.

 

Regards

.g

 


Re: Setting up Snappy compression in Hadoop

Posted by bharath vissapragada <bh...@gmail.com>.
Your natives should be in LD_LIBRARY_PATH or java.library.path for Hadoop
to pick them up. You can try adding export HADOOP_OPTS="$HADOOP_OPTS
-Djava.library.path=<path to your native libs>" to hadoop-env.sh on the TTs
and clients/gateways, restart the TTs, and give it another try. The reason
it's working for HBase is that you are manually pointing HBASE_LIBRARY_PATH
at the natives. My guess is they are in the wrong location.
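
(As a sketch, the hadoop-env.sh addition could look like the following — the path is an assumption for a typical Hadoop 1.x layout; point it at wherever your .so files actually live:)

```shell
# In $HADOOP_HOME/conf/hadoop-env.sh on every TaskTracker and client/gateway.
# The directory below is an example -- use your actual native library dir.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native/Linux-amd64-64"
```

After editing the file, the TaskTrackers must be restarted for the new java.library.path to take effect.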


On Thu, Jan 2, 2014 at 5:07 PM, Amit Sela <am...@infolinks.com> wrote:

> I did everything mentioned in the link Ted mentioned, and the test
> actually works, but using Snappy for MapReduce map output compression still
> fails with "native snappy library not available".
>
>
> On Wed, Jan 1, 2014 at 6:37 PM, bharath vissapragada <
> bharathvissapragada1990@gmail.com> wrote:
>
>> Did you build it for your platform? You can do an "ldd" on the .so file
>> to check if the dependent libs are present. Also make sure you placed it in
>> the right directory for your platform (Linux-amd64-64 or Linux-i386-32)
>>
>>
>> On Wed, Jan 1, 2014 at 10:02 PM, Ted Yu <yu...@gmail.com> wrote:
>>
>>> Please take a look at
>>> http://hbase.apache.org/book.html#snappy.compression
>>>
>>> Cheers
>>>
>>>
>>> On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <am...@infolinks.com> wrote:
>>>
>>>> Hi all,
>>>>
>>>> I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
>>>> compression.
>>>> I'm adding the configurations:
>>>>
>>>> configuration.setBoolean("mapred.compress.map.output", true);
>>>> configuration.set("mapred.map.output.compression.codec",
>>>> "org.apache.hadoop.io.compress.SnappyCodec");
>>>>
>>>> And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/
>>>>
>>>> Still, all map tasks fail with "native snappy library not available".
>>>>
>>>> Could anyone elaborate on how to install Snappy for Hadoop ?
>>>>
>>>> Thanks,
>>>>
>>>> Amit.
>>>>
>>>
>>>
>>
>
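
(Pulling the suggestions so far together, a quick diagnostic pass might look like this sketch — the paths are assumptions for a Hadoop 1.x install on 64-bit Linux:)

```shell
#!/bin/sh
# Sketch of the checks suggested in this thread; adjust paths to your install.
NATIVE_DIR="${HADOOP_HOME:-/usr/local/hadoop}/lib/native/Linux-amd64-64"

# 1. Is libsnappy present, and are its own dependencies resolvable?
ldd "$NATIVE_DIR/libsnappy.so.1" || echo "libsnappy.so.1 missing or unresolvable"

# 2. Was libhadoop.so itself built with Snappy support?
nm "$NATIVE_DIR/libhadoop.so" | grep -i snappy \
  || echo "no snappy symbols: libhadoop.so was built without Snappy support"
```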


RE: Setting up Snappy compression in Hadoop

Posted by java8964 <ja...@hotmail.com>.
If you really confirmed that libsnappy.so.1 is in the correct location, is loaded onto the Java library path, and works in your test program, but still didn't work in MR, there is another possibility, one that puzzled me before.
How did you get the libhadoop.so in your Hadoop environment? Did you compile it yourself, or is it from a vendor?
You want to make sure the Java native methods that invoke Snappy are also compiled into and available in libhadoop.so.
For example, the following command will prove it:
$ nm ./libhadoop.so | grep snappy
00000000000035c0 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_compressBytesDirect
0000000000003960 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs
0000000000003bb0 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_decompressBytesDirect
0000000000003f60 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs
0000000000206cf0 b dlsym_snappy_compress
0000000000206d20 b dlsym_snappy_uncompress
Without these Java native methods compiled into libhadoop.so, the MR runtime will also complain that "native snappy library not available".
Yong
Date: Thu, 2 Jan 2014 13:37:46 +0200
Subject: Re: Setting up Snappy compression in Hadoop
From: amits@infolinks.com
To: user@hadoop.apache.org

I did everything mentioned in the link Ted mentioned, and the test actually works, but using Snappy for MapReduce map output compression still fails with "native snappy library not available".



RE: Setting up Snappy compression in Hadoop

Posted by java8964 <ja...@hotmail.com>.
If you really confirmed that libsnappy.so.1 is in the correct location, and being loaded into java library path, working in your test program, but still didn't work in MR, there is one another possibility which was puzzling me before.
How do you get the libhadoop.so in your hadoop environment? Did you compile it by yourself, or is it  from some vendors?
You want to make sure the Java native methods of invoking snappy is also being compiled and available in libhadoop.so.
For example, the following command will prove it:
$ nm ./libhadoop.so | grep snappy00000000000035c0 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_compressBytesDirect0000000000003960 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs0000000000003bb0 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_decompressBytesDirect0000000000003f60 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs0000000000206cf0 b dlsym_snappy_compress0000000000206d20 b dlsym_snappy_uncompress
Without these Java native methods being compiled and available in the libhadoop.so, MR runtime will also complain that "native snappy library not available".
Yong
Date: Thu, 2 Jan 2014 13:37:46 +0200
Subject: Re: Setting up Snappy compression in Hadoop
From: amits@infolinks.com
To: user@hadoop.apache.org

I did everything mentioned in the link Ted mentioned, and the test actually works, but using Snappy for MapReduce map output compression still fails with "native snappy library not available".


On Wed, Jan 1, 2014 at 6:37 PM, bharath vissapragada <bh...@gmail.com> wrote:

Did you build it for your platform? You can do an "ldd" on the .so file to check if the dependent libs are present. Also make sure you placed it in the right directory for your platform (Linux-amd64-64 or Linux-i386-32)




On Wed, Jan 1, 2014 at 10:02 PM, Ted Yu <yu...@gmail.com> wrote:



Please take a look at http://hbase.apache.org/book.html#snappy.compression
Cheers




On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <am...@infolinks.com> wrote:




Hi all, 
I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output compression.I'm adding the configurations:
configuration.setBoolean("mapred.compress.map.output", true);





configuration.set("mapred.map.output.compression.codec", "org.apache.hadoop.io.compress.SnappyCodec");

And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/





Still, all map tasks fail with "native snappy library not available".
Could anyone elaborate on how to install Snappy for Hadoop ?





Thanks, 
Amit.





 		 	   		  

RE: Setting up Snappy compression in Hadoop

Posted by java8964 <ja...@hotmail.com>.
If you really confirmed that libsnappy.so.1 is in the correct location, and being loaded into java library path, working in your test program, but still didn't work in MR, there is one another possibility which was puzzling me before.
How do you get the libhadoop.so in your hadoop environment? Did you compile it by yourself, or is it  from some vendors?
You want to make sure the Java native methods of invoking snappy is also being compiled and available in libhadoop.so.
For example, the following command will prove it:
$ nm ./libhadoop.so | grep snappy00000000000035c0 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_compressBytesDirect0000000000003960 T Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_initIDs0000000000003bb0 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_decompressBytesDirect0000000000003f60 T Java_org_apache_hadoop_io_compress_snappy_SnappyDecompressor_initIDs0000000000206cf0 b dlsym_snappy_compress0000000000206d20 b dlsym_snappy_uncompress
Without these Java native methods being compiled and available in the libhadoop.so, MR runtime will also complain that "native snappy library not available".
Yong
Date: Thu, 2 Jan 2014 13:37:46 +0200
Subject: Re: Setting up Snappy compression in Hadoop
From: amits@infolinks.com
To: user@hadoop.apache.org

I did everything mentioned in the link Ted mentioned, and the test actually works, but using Snappy for MapReduce map output compression still fails with "native snappy library not available".


On Wed, Jan 1, 2014 at 6:37 PM, bharath vissapragada <bh...@gmail.com> wrote:

Did you build it for your platform? You can do an "ldd" on the .so file to check if the dependent libs are present. Also make sure you placed it in the right directory for your platform (Linux-amd64-64 or Linux-i386-32)




On Wed, Jan 1, 2014 at 10:02 PM, Ted Yu <yu...@gmail.com> wrote:



Please take a look at http://hbase.apache.org/book.html#snappy.compression
Cheers




On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <am...@infolinks.com> wrote:

Hi all,

I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output compression.
I'm adding the configurations:

configuration.setBoolean("mapred.compress.map.output", true);
configuration.set("mapred.map.output.compression.codec", "org.apache.hadoop.io.compress.SnappyCodec");

And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/

Still, all map tasks fail with "native snappy library not available".

Could anyone elaborate on how to install Snappy for Hadoop ?

Thanks,
Amit.

Re: Setting up Snappy compression in Hadoop

Posted by Amit Sela <am...@infolinks.com>.
I did everything mentioned in the link Ted posted, and the test actually
works, but using Snappy for MapReduce map output compression still fails
with "native snappy library not available".
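[Editorial note: when a standalone test passes but MR tasks still fail, a common suspect on Hadoop 1.x is that the child task JVM does not get the native directory on java.library.path. The property name mapred.child.java.opts is the Hadoop 1.x one; the paths below are hypothetical examples. A small sketch for inspecting the opts string:]

```shell
# Sketch (not from the thread): extract -Djava.library.path from a JVM
# options string, e.g. the value of mapred.child.java.opts in
# mapred-site.xml, so you can check the child JVMs see the native libs.
lib_path() { sed -n 's/.*-Djava\.library\.path=\([^ ]*\).*/\1/p'; }

# Hypothetical example opts; substitute the value from your cluster config.
opts='-Xmx512m -Djava.library.path=/usr/lib/hadoop/lib/native/Linux-amd64-64'
echo "$opts" | lib_path
```

If the extracted path is empty or does not contain the directory holding libsnappy.so.1 and libhadoop.so, that would explain a working local test alongside failing map tasks.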


On Wed, Jan 1, 2014 at 6:37 PM, bharath vissapragada <
bharathvissapragada1990@gmail.com> wrote:

> Did you build it for your platform? You can do an "ldd" on the .so file to
> check if the dependent libs are present. Also make sure you placed it in
> the right directory for your platform (Linux-amd64-64 or Linux-i386-32)
>
>
> On Wed, Jan 1, 2014 at 10:02 PM, Ted Yu <yu...@gmail.com> wrote:
>
>> Please take a look at
>> http://hbase.apache.org/book.html#snappy.compression
>>
>> Cheers
>>
>>
>> On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <am...@infolinks.com> wrote:
>>
>>> Hi all,
>>>
>>> I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
>>> compression.
>>> I'm adding the configurations:
>>>
>>> configuration.setBoolean("mapred.compress.map.output", true);
>>> configuration.set("mapred.map.output.compression.codec",
>>> "org.apache.hadoop.io.compress.SnappyCodec");
>>>
>>> And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/
>>>
>>> Still, all map tasks fail with "native snappy library not available".
>>>
>>> Could anyone elaborate on how to install Snappy for Hadoop ?
>>>
>>> Thanks,
>>>
>>> Amit.
>>>
>>
>>
>

Re: Setting up Snappy compression in Hadoop

Posted by bharath vissapragada <bh...@gmail.com>.
Did you build it for your platform? You can do an "ldd" on the .so file to
check if the dependent libs are present. Also make sure you placed it in
the right directory for your platform (Linux-amd64-64 or Linux-i386-32)
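[Editorial note: the ldd check suggested above can be scripted as below. The native-library path is the one from this thread and is an assumption about your layout.]

```shell
# Sketch of the ldd check: list only the unresolved dependencies of the
# Snappy shared library. Path is from the thread; adjust as needed.
SO="${HADOOP_HOME:-/usr/lib/hadoop}/lib/native/Linux-amd64-64/libsnappy.so.1"

# Print only the "not found" lines from ldd output read from stdin.
missing_deps() { grep 'not found'; }

if [ -e "$SO" ]; then
    ldd "$SO" | missing_deps || echo "all dependencies resolved"
else
    echo "$SO not found"
fi
```

Any line printed by missing_deps names a shared library that must be installed (or added to the loader path) before Snappy can work.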


On Wed, Jan 1, 2014 at 10:02 PM, Ted Yu <yu...@gmail.com> wrote:

> Please take a look at http://hbase.apache.org/book.html#snappy.compression
>
> Cheers
>
>
> On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <am...@infolinks.com> wrote:
>
>> Hi all,
>>
>> I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
>> compression.
>> I'm adding the configurations:
>>
>> configuration.setBoolean("mapred.compress.map.output", true);
>> configuration.set("mapred.map.output.compression.codec",
>> "org.apache.hadoop.io.compress.SnappyCodec");
>>
>> And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/
>>
>> Still, all map tasks fail with "native snappy library not available".
>>
>> Could anyone elaborate on how to install Snappy for Hadoop ?
>>
>> Thanks,
>>
>> Amit.
>>
>
>

Re: Setting up Snappy compression in Hadoop

Posted by Ted Yu <yu...@gmail.com>.
Please take a look at http://hbase.apache.org/book.html#snappy.compression

Cheers


On Wed, Jan 1, 2014 at 8:05 AM, Amit Sela <am...@infolinks.com> wrote:

> Hi all,
>
> I'm running on Hadoop 1.0.4 and I'd like to use Snappy for map output
> compression.
> I'm adding the configurations:
>
> configuration.setBoolean("mapred.compress.map.output", true);
> configuration.set("mapred.map.output.compression.codec",
> "org.apache.hadoop.io.compress.SnappyCodec");
>
> And I've added libsnappy.so.1 to $HADOOP_HOME/lib/native/Linux-amd64-64/
>
> Still, all map tasks fail with "native snappy library not available".
>
> Could anyone elaborate on how to install Snappy for Hadoop ?
>
> Thanks,
>
> Amit.
>
