Posted to user@hadoop.apache.org by 王鹏飞 <wp...@gmail.com> on 2015/03/24 10:18:36 UTC
Something about the Snappy Compression Tool
I noticed a map-reduce job failing with this exception:
java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
I googled it and realized that the Snappy library was missing. I installed
Snappy and copied libsnappy.so* to $HADOOP_HOME/lib/native.
In .bashrc I added:
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native/"
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native"
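A related sketch (an assumption on my part, not something I have verified on this cluster: the sbin start scripts source $HADOOP_HOME/etc/hadoop/hadoop-env.sh, and the absolute path is the one from the configs below) would be to set the same variables there, so daemons that are never started from an interactive shell also see them:

  # hadoop-env.sh - sourced by the daemon start scripts, unlike ~/.bashrc
  export LD_LIBRARY_PATH=/home/hadoop/software/hadoop-2.4.1/lib/native:$LD_LIBRARY_PATH
  export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/home/hadoop/software/hadoop-2.4.1/lib/native"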
In core-site.xml I added:
<property>
  <name>io.compression.codecs</name>
  <value>
    org.apache.hadoop.io.compress.GzipCodec,
    org.apache.hadoop.io.compress.DefaultCodec,
    org.apache.hadoop.io.compress.BZip2Codec,
    org.apache.hadoop.io.compress.SnappyCodec
  </value>
</property>
In mapred-site.xml I added:
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapred.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/home/hadoop/software/hadoop-2.4.1/lib/native</value>
</property>
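For a quick test that does not touch the cluster-wide files, the same settings can also be passed per job as generic options (a sketch; the examples-jar path and the input/output directories are assumptions, and the codec key shown is the Hadoop 2 name mapreduce.map.output.compress.codec):

  hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.4.1.jar wordcount \
    -Dmapreduce.map.output.compress=true \
    -Dmapreduce.map.output.compress.codec=org.apache.hadoop.io.compress.SnappyCodec \
    /user/hadoop/input /user/hadoop/output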
In yarn-site.xml I added:
<property>
  <name>yarn.app.mapreduce.am.env</name>
  <value>LD_LIBRARY_PATH=/home/hadoop/software/hadoop-2.4.1/lib/native</value>
</property>
Finally I made the same settings on the datanodes, but it still didn't work.
In a pseudo-distributed cluster I only had to copy libsnappy.so* to
$HADOOP_HOME/lib/native, without changing any configuration, and it worked fine.
Why does it turn out to be so difficult on this cluster?
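One way to see what the libhadoop on each node actually supports is the checknative subcommand (a diagnostic sketch; it assumes the 2.4.1 build includes it, and the output below only shows the rough shape):

  hadoop checknative -a
  # roughly:
  #   hadoop: true /home/hadoop/software/hadoop-2.4.1/lib/native/libhadoop.so
  #   snappy: false
  # "snappy: false" (or "hadoop: false") on the worker nodes would be
  # consistent with the UnsatisfiedLinkError above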
Re: Something about the Snappy Compression Tool
Posted by Azuryy Yu <az...@gmail.com>.
Hi,
You should compile the Hadoop source code with -Drequire.snappy, for example:
mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy
but you also need to install Snappy (the library and its development headers) before compiling.
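A rough end-to-end sketch of that (the apt package names and output paths are assumptions for a Debian/Ubuntu build host; they differ on other distros):

  # 1. install the Snappy library and headers the native build links against
  sudo apt-get install libsnappy1 libsnappy-dev
  # 2. rebuild the native distribution from the Hadoop source tree
  mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy
  # 3. the rebuilt libraries end up under hadoop-dist/target/hadoop-2.4.1/lib/native;
  #    copy libhadoop.so* from there into $HADOOP_HOME/lib/native on every node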
On Wed, Mar 25, 2015 at 9:39 AM, 王鹏飞 <wp...@gmail.com> wrote:
> How do I recompile the library? My Hadoop was built from the tar ball, and
> /lib/native already contains files like *.a and *.so. Do I need to build from
> source to recompile the library?
>
> On Tue, Mar 24, 2015 at 6:35 PM, Tsuyoshi Ozawa <oz...@apache.org> wrote:
>
>> Sometimes the compiled native libraries included in the tar ball don't work
>> correctly - how about recompiling the library in your environment?
>>
>> Thanks,
>> - Tsuyoshi
>>
Re: Something about the Snappy Compression Tool
Posted by 王鹏飞 <wp...@gmail.com>.
How do I recompile the library? My Hadoop was built from the tar ball, and
/lib/native already contains files like *.a and *.so. Do I need to build from
source to recompile the library?
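One quick way to check whether the bundled libhadoop.so already has Snappy support compiled in (a sketch; the install path matches the configs above, and the exact symbol names are an assumption):

  nm -D /home/hadoop/software/hadoop-2.4.1/lib/native/libhadoop.so | grep -i snappy
  # symbols like Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_*
  # suggest Snappy support was built in; no output suggests it was not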
On Tue, Mar 24, 2015 at 6:35 PM, Tsuyoshi Ozawa <oz...@apache.org> wrote:
> Sometimes the compiled native libraries included in the tar ball don't work
> correctly - how about recompiling the library in your environment?
>
> Thanks,
> - Tsuyoshi
>
Re: Something about the Snappy Compression Tool
Posted by Tsuyoshi Ozawa <oz...@apache.org>.
Sometimes the compiled native libraries included in the tar ball don't work
correctly - how about recompiling the library in your environment?
Thanks,
- Tsuyoshi
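If the library is rebuilt, it also has to reach every worker node; a minimal distribution sketch (the hostnames are placeholders, and the install path is the one from the configs above):

  for h in slave1 slave2 slave3; do
    # push the rebuilt libhadoop (and libsnappy, if bundled) to each node
    scp /home/hadoop/software/hadoop-2.4.1/lib/native/libhadoop.so* \
        /home/hadoop/software/hadoop-2.4.1/lib/native/libsnappy.so* \
        "$h":/home/hadoop/software/hadoop-2.4.1/lib/native/
  done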