Posted to mapreduce-user@hadoop.apache.org by peterm_second <re...@gmail.com> on 2014/12/10 14:57:35 UTC

Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Hi guys,
I have a Hadoop + HBase + Hive application.
For some reason my cluster is unable to find the Snappy native library.
Here is the exception:
  org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
     at org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583)
     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
     at java.security.AccessController.doPrivileged(Native Method)
     at javax.security.auth.Subject.doAs(Subject.java:422)
     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)


I am working on 64-bit Ubuntu 14.04 LTS. I've installed snappy on my
OS and copied the libs to hadoop_home/lib/native.
I've also added the libs to the JRE, but it still fails as if nothing is
present.
I've added HADOOP_OPTS="-Djava.net.preferIPv4Stack=true $GC_DEBUG_OPTS
-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native $HADOOP_OPTS"
In my yarn-site.xml I have
<property>
    <name>yarn.app.mapreduce.am.env</name>
    <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
</property>

In my mapred-site.xml I have
<property>
    <name>mapred.child.java.opts</name>
    <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
</property>
<property>
    <name>mapreduce.reduce.java.opts</name>
    <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
</property>

The last two were a desperation move.
The result is always the same. Any ideas would be welcomed.

Thanks,
Peter
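[Editor's note: for anyone debugging the same failure, a quick sanity check is to confirm the native libraries exist, match the JVM's architecture, and are visible to Hadoop. A sketch, not specific to Peter's cluster — the /usr/local/hadoop-2.4.0 path is an assumption, adjust to your install:]

```shell
#!/bin/sh
# Where Hadoop expects its native libraries
# (assumed path; falls back to /usr/local/hadoop-2.4.0 if HADOOP_HOME is unset).
NATIVE_DIR="${HADOOP_HOME:-/usr/local/hadoop-2.4.0}/lib/native"

# 1. Do libhadoop and libsnappy actually exist there?
ls "$NATIVE_DIR"/libhadoop.so* "$NATIVE_DIR"/libsnappy.so* 2>/dev/null \
  || echo "no native libs found in $NATIVE_DIR"

# 2. Were they built for the right architecture? A 32-bit .so on a
#    64-bit JVM fails exactly as if the library were missing.
file "$NATIVE_DIR"/libhadoop.so* 2>/dev/null || true

# 3. Ask Hadoop itself which native codecs it can load.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a
else
  echo "hadoop not on PATH; run 'hadoop checknative -a' on a cluster node"
fi
```

If step 2 reports "ELF 32-bit" on a 64-bit box, that is the classic cause of this exact stack trace.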

Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by peterm_second <re...@gmail.com>.
Hi Hanish,
Thanks for the link, it did help. Long story short: always recompile the
native libraries for your machine :)

Thanks,
Peter
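[Editor's note: for the archive, the rebuild Peter describes looks roughly like this. A sketch only, guarded so it runs nothing unless the build prerequisites are present; the version, directory names, and install path are assumptions:]

```shell
#!/bin/sh
# Rebuild Hadoop's native libraries on the target machine (sketch).
# Assumes: JDK, Maven, gcc/cmake, protobuf 2.5, and the snappy dev headers
# are installed, and a Hadoop source tree unpacked in hadoop-2.4.0-src/.
HADOOP_VERSION="2.4.0"
SRC_DIR="hadoop-${HADOOP_VERSION}-src"

if command -v mvn >/dev/null 2>&1 && [ -d "$SRC_DIR" ]; then
  cd "$SRC_DIR"
  # -Pdist,native builds libhadoop.so for THIS machine's architecture;
  # -Drequire.snappy makes the build fail fast if snappy headers are absent.
  mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy
  # Replace the bundled (32-bit) libs with the freshly built ones.
  cp "hadoop-dist/target/hadoop-${HADOOP_VERSION}/lib/native/"* \
     "${HADOOP_HOME:-/usr/local/hadoop-2.4.0}/lib/native/"
else
  echo "build prerequisites missing; see BUILDING.txt in the Hadoop source"
fi
```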

On 11.12.2014 05:46, Hanish Bansal wrote:
> Hope this may help you:
>
> http://blogs.impetus.com/big_data/big_data_technologies/SnappyCompressionInHBase.do
>
> On Thu, Dec 11, 2014 at 7:25 AM, Fabio <anytek88@gmail.com> wrote:
>
>     Plain Apache Hadoop 2.5.0.
>     Too bad it didn't work, hope someone can help.
>
>
>     On 12/10/2014 06:22 PM, peterm_second wrote:
>
>         Hi Fabio ,
>         Thanks for the reply, but unfortunately it didn't work. I am
>         using vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I
>         am using the vanilla distros.
>         I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't
>         make any change. What version were you using ?
>
>         Peter
>
>
>         On 10.12.2014 16:23, Fabio wrote:
>
>             Not sure it will help, but if the problem is native
>             library loading, I spent a loooong time trying anything to
>             make it work.
>             I may suggest to try also:
>             export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>             export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>             I have this both in the bash "init" script
>             (/etc/profile.p/...) and in
>             /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite
>             sure it's redundant, but as long as it works I don't
>             change it.
>             I see here I commented out my attempts to set HADOOP_OPTS,
>             so maybe it's not necessary.
>             I don't see anything in my .xml config files.
>             Also, someone says to compile the libraries under your 64
>             bit system, since the ones in Hadoop are for a 32bit
>             architecture.
>
>             Good luck
>
>             Fabio
>
>             On 12/10/2014 02:57 PM, peterm_second wrote:
>
>                 Hi guys,
>                 I have a hadoop + hbase + hive application,
>                 For some reason my cluster is unable to find the
>                 snappy native library
>                 Here is the exception :
>                 org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>                     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>                     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>                     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>                     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>                     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>                     at org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>                     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583)
>                     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>                     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>                     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>                     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>                     at java.security.AccessController.doPrivileged(Native Method)
>                     at javax.security.auth.Subject.doAs(Subject.java:422)
>                     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>                     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>
>
>                 I am working on 64-bit Ubuntu 14.04 LTS. I've
>                 installed snappy on my OS and copied the libs
>                 to hadoop_home/lib/native.
>                 I've also added the libs to the JRE, but it still
>                 fails as if nothing is present.
>                 I've added
>                 HADOOP_OPTS="-Djava.net.preferIPv4Stack=true
>                 $GC_DEBUG_OPTS
>                 -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native
>                 $HADOOP_OPTS"
>                 In my yarn-site.xml I have
>                 <property>
>                     <name>yarn.app.mapreduce.am.env</name>
>                     <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>                 </property>
>
>                 In my mapred-site.xml I have
>                 <property>
>                     <name>mapred.child.java.opts</name>
>                     <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>                 <property>
>                     <name>mapreduce.reduce.java.opts</name>
>                     <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>
>                 The last two were a desperation move.
>                 The result is always the same. Any ideas would be
>                 welcomed.
>
>                 Thanks,
>                 Peter
>
>
>
>
>
>
>
>
>
>
>
>
>
>
> -- 
> *Thanks & Regards*
> *Hanish Bansal*




Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by Hanish Bansal <ha...@gmail.com>.
Hope this may help you:

http://blogs.impetus.com/big_data/big_data_technologies/SnappyCompressionInHBase.do

On Thu, Dec 11, 2014 at 7:25 AM, Fabio <an...@gmail.com> wrote:

> Plain Apache Hadoop 2.5.0.
> Too bad it didn't work, hope someone can help.
>
>
> On 12/10/2014 06:22 PM, peterm_second wrote:
>
>> Hi Fabio ,
>> Thanks for the reply, but unfortunately it didn't work. I am using
>> vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using the vanilla
>> distros.
>> I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make any
>> change. What version were you using ?
>>
>> Peter
>>
>>
>> On 10.12.2014 16:23, Fabio wrote:
>>
>>> Not sure it will help, but if the problem is native library loading, I
>>> spent a loooong time trying anything to make it work.
>>> I may suggest to try also:
>>> export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>>> export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>>> I have this both in the bash "init" script (/etc/profile.p/...) and in
>>> /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure it's
>>> redundant, but as long as it works I don't change it.
>>> I see here I commented out my attempts to set HADOOP_OPTS, so maybe it's
>>> not necessary.
>>> I don't see anything in my .xml config files.
>>> Also, someone says to compile the libraries under your 64 bit system,
>>> since the ones in Hadoop are for a 32bit architecture.
>>>
>>> Good luck
>>>
>>> Fabio
>>>
>>> On 12/10/2014 02:57 PM, peterm_second wrote:
>>>
>>>> Hi guys,
>>>> I have a hadoop + hbase + hive application,
>>>> For some reason my cluster is unable to find the snappy native library
>>>> Here is the exception :
>>>>  org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>>     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
>>>>     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>>>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>>     at org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>>>>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583)
>>>>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>>>>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>>>>
>>>>
>>>> I am working on 64-bit Ubuntu 14.04 LTS. I've installed snappy on my
>>>> OS and copied the libs to hadoop_home/lib/native.
>>>> I've also added the libs to the JRE, but it still fails as if nothing
>>>> is present.
>>>> I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true
>>>> $GC_DEBUG_OPTS -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native
>>>> $HADOOP_OPTS"
>>>> In my yarn-site.xml I have
>>>> <property>
>>>>     <name>yarn.app.mapreduce.am.env</name>
>>>>     <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>>>> </property>
>>>>
>>>> In my mapred-site.xml I have
>>>> <property>
>>>>     <name>mapred.child.java.opts</name>
>>>>     <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>> </property>
>>>> <property>
>>>>     <name>mapreduce.reduce.java.opts</name>
>>>>     <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>> </property>
>>>>
>>>> The last two were a desperation move.
>>>> The result is always the same. Any ideas would be welcomed.
>>>>
>>>> Thanks,
>>>> Peter
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>>
>>
>


-- 
*Thanks & Regards*
*Hanish Bansal*

Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by Hanish Bansal <ha...@gmail.com>.
Hope this may help you:

http://blogs.impetus.com/big_data/big_data_technologies/SnappyCompressionInHBase.do

On Thu, Dec 11, 2014 at 7:25 AM, Fabio <an...@gmail.com> wrote:

> Plain Apache Hadoop 2.5.0.
> Too bad it didn't work, hope someone can help.
>
>
> On 12/10/2014 06:22 PM, peterm_second wrote:
>
>> Hi Fabio ,
>> Thanks for the reply, but unfortunately it didn't work. I am using
>> vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using the vanilla
>> distros.
>> I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make any
>> change. What version were you using ?
>>
>> Peter
>>
>>
>> On 10.12.2014 16:23, Fabio wrote:
>>
>>> Not sure it will help, but if the problem is native library loading, I
>>> spent a loooong time trying anything to make it work.
>>> I may suggest to try also:
>>> export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>>> export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>>> I have this both in the bash "init" script (/etc/profile.p/...) and in
>>> /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure it's
>>> redundant, but as long as it works I don't change it.
>>> I see here I commented out my attempts to set HADOOP_OPTS, so maybe it's
>>> not necessary.
>>> I don't see anything in my .xml config files.
>>> Also, someone says to compile the libraries under your 64 bit system,
>>> since the ones in Hadoop are for a 32bit architecture.
>>>
>>> Good luck
>>>
>>> Fabio
>>>
>>> On 12/10/2014 02:57 PM, peterm_second wrote:
>>>
>>>> Hi guys,
>>>> I have a hadoop + hbase + hive application,
>>>> For some reason my cluster is unable to find the snappy native library
>>>> Here is the exception :
>>>>  org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>>     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
>>>> Method)
>>>>     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>>>>
>>>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>>
>>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(
>>>> CodecPool.java:148)
>>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(
>>>> CodecPool.java:163)
>>>>     at org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>>>>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.
>>>> sortAndSpill(MapTask.java:1583)
>>>>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(
>>>> MapTask.java:1462)
>>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(
>>>> UserGroupInformation.java:1548)
>>>>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>>>>
>>>>
>>>> I am working with 64-bit Ubuntu 14.04 LTS. I've installed snappy on my
>>>> OS and copied the libs to hadoop_home/lib/native.
>>>> I've also added the libs to the JRE, but it still fails as if nothing
>>>> is present.
>>>> I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true
>>>> $GC_DEBUG_OPTS -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native
>>>> $HADOOP_OPTS"
>>>> in my yarn xml I have
>>>> <property>
>>>>       <name>yarn.app.mapreduce.am.env</name>
>>>> <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>>>> </property>
>>>>
>>>> in my mapred-site.xml i have
>>>> <property>
>>>>         <name>mapred.child.java.opts</name>
>>>>         <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>>     </property>
>>>>     <property>
>>>>         <name>mapreduce.reduce.java.opts</name>
>>>> <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>>     </property>
>>>>
>>>> The last two were a desperation move.
>>>> The result is always the same. Any ideas would be welcomed.
>>>>
>>>> Thanks,
>>>> Peter
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
>>>
>> .
>>
>>
>


-- 
*Thanks & Regards*
*Hanish Bansal*
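[Editorial note: the `UnsatisfiedLinkError` on `NativeCodeLoader.buildSupportsSnappy()` discussed above usually means the JVM loaded no (or a snappy-less) libhadoop.so. Before juggling config, it helps to ask Hadoop itself what it can load. A minimal sketch, assuming the install prefix from the thread; `hadoop checknative -a` exists in Hadoop 2.4:]

```shell
# Sketch: verify which native libraries the Hadoop JVM can actually load.
# The install path /usr/local/hadoop-2.4.0 is an assumption taken from
# the thread; adjust to your layout.
export HADOOP_HOME=/usr/local/hadoop-2.4.0
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH

$HADOOP_HOME/bin/hadoop checknative -a
# Prints one true/false line per library (hadoop, zlib, snappy, lz4, ...).
# If "hadoop: false", libhadoop.so itself is not loadable (wrong arch or
# wrong path); if "hadoop: true" but "snappy: false", libhadoop was built
# without snappy support.
```

If this command reports failures on every node, the problem is the build of the native libraries, not the YARN/MapReduce configuration.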


Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by Fabio <an...@gmail.com>.
Plain Apache Hadoop 2.5.0.
Too bad it didn't work, hope someone can help.

On 12/10/2014 06:22 PM, peterm_second wrote:
> Hi Fabio ,
> Thanks for the reply, but unfortunately it didn't work. I am using 
> vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using the 
> vanilla distros.
> I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make any 
> change. What version were you using ?
>
> Peter
>
>
> On 10.12.2014 16:23, Fabio wrote:
>> Not sure it will help, but if the problem is native library loading, 
>> I spent a loooong time trying anything to make it work.
>> I may suggest to try also:
>> export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>> export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>> I have this both in the bash "init" script (/etc/profile.d/...) and 
>> in /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure it's 
>> redundant, but as long as it works I don't change it.
>> I see here I commented out my attempts to set HADOOP_OPTS, so maybe 
>> it's not necessary.
>> I don't see anything in my .xml config files.
>> Also, someone says to compile the libraries under your 64 bit system, 
>> since the ones in Hadoop are for a 32bit architecture.
>>
>> Good luck
>>
>> Fabio
>>
>> On 12/10/2014 02:57 PM, peterm_second wrote:
>>> Hi guys,
>>> I have a hadoop + hbase + hive application,
>>> For some reason my cluster is unable to find the snappy native library
>>> Here is the exception :
>>>  org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>     at 
>>> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native 
>>> Method)
>>>     at 
>>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63) 
>>>
>>>     at 
>>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132) 
>>>
>>>     at 
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>     at 
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>     at org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>>>     at 
>>> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583) 
>>>
>>>     at 
>>> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>>     at 
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548) 
>>>
>>>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>>>
>>>
>>> I am working with 64-bit Ubuntu 14.04 LTS. I've installed snappy on 
>>> my OS and copied the libs to hadoop_home/lib/native.
>>> I've also added the libs to the JRE, but it still fails as if 
>>> nothing is present.
>>> I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true 
>>> $GC_DEBUG_OPTS 
>>> -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native $HADOOP_OPTS"
>>> in my yarn xml I have
>>> <property>
>>>       <name>yarn.app.mapreduce.am.env</name>
>>> <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>>> </property>
>>>
>>> in my mapred-site.xml i have
>>> <property>
>>>         <name>mapred.child.java.opts</name>
>>>         <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>     </property>
>>>     <property>
>>>         <name>mapreduce.reduce.java.opts</name>
>>> <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>     </property>
>>>
>>> The last two were a desperation move.
>>> The result is always the same. Any ideas would be welcomed.
>>>
>>> Thanks,
>>> Peter
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>
> .
>

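[Editorial note: the thread scatters the same advice across HADOOP_OPTS, exports, and XML. Consolidated as a hadoop-env.sh fragment it is easier to audit. This is a sketch combining the suggestions above; the /usr/local/hadoop-2.4.0 prefix is an assumption, and some of these settings are redundant with each other, as Fabio notes:]

```shell
# Sketch of etc/hadoop/hadoop-env.sh additions combining the suggestions
# in this thread. The install prefix is an assumption; adjust as needed.
export HADOOP_HOME=/usr/local/hadoop-2.4.0
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native
# Make the daemons' JVMs look in the native dir as well:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```

Setting these in hadoop-env.sh (rather than only in a login shell profile) matters because daemons started by init scripts do not source /etc/profile.d.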

Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by peterm_second <re...@gmail.com>.
Sorry, I should've mentioned this: I've installed the snappy lib using 
apt-get; my Hadoop had no snappy support built in.

Peter


On 10.12.2014 19:28, Ted Yu wrote:
> See:
> https://issues.apache.org/jira/browse/HADOOP-9911
>
> Can you recompile snappy for 64-bit system ?
>
> Cheers
>
> On Wed, Dec 10, 2014 at 9:22 AM, peterm_second <regestrer@gmail.com> wrote:
>
>     Hi Fabio ,
>     Thanks for the reply, but unfortunately it didn't work. I am using
>     vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using
>     the vanilla distros.
>     I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make
>     any change. What version were you using ?
>
>     Peter
>
>
>
>     On 10.12.2014 16:23, Fabio wrote:
>
>         Not sure it will help, but if the problem is native library
>         loading, I spent a loooong time trying anything to make it work.
>         I may suggest to try also:
>         export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>         export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>         I have this both in the bash "init" script
>         (/etc/profile.d/...) and in
>         /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure
>         it's redundant, but as long as it works I don't change it.
>         I see here I commented out my attempts to set HADOOP_OPTS, so
>         maybe it's not necessary.
>         I don't see anything in my .xml config files.
>         Also, someone says to compile the libraries under your 64 bit
>         system, since the ones in Hadoop are for a 32bit architecture.
>
>         Good luck
>
>         Fabio
>
>         On 12/10/2014 02:57 PM, peterm_second wrote:
>
>             Hi guys,
>             I have a hadoop + hbase + hive application,
>             For some reason my cluster is unable to find the snappy
>             native library
>             Here is the exception :
>              org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>                 at
>             org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
>             Method)
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>                 at
>             org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583)
>
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>                 at
>             org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>                 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>                 at
>             org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>                 at java.security.AccessController.doPrivileged(Native
>             Method)
>                 at javax.security.auth.Subject.doAs(Subject.java:422)
>                 at
>             org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>
>                 at
>             org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>
>
>             I am working with 64-bit Ubuntu 14.04 LTS. I've installed
>             snappy on my OS and copied the libs to
>             hadoop_home/lib/native.
>             I've also added the libs to the JRE, but it still fails as
>             if nothing is present.
>             I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true
>             $GC_DEBUG_OPTS
>             -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native
>             $HADOOP_OPTS"
>             in my yarn xml I have
>             <property>
>                   <name>yarn.app.mapreduce.am.env</name>
>             <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>             </property>
>
>             in my mapred-site.xml i have
>             <property>
>                     <name>mapred.child.java.opts</name>
>                     <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>                 <property>
>                     <name>mapreduce.reduce.java.opts</name>
>             <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>
>             The last two were a desperation move.
>             The result is always the same. Any ideas would be welcomed.
>
>             Thanks,
>             Peter
>
>
>
>
>
>
>
>
>
>

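[Editorial note: installing libsnappy via apt-get is not enough if the bundled libhadoop.so was built without snappy support (or for 32-bit, per HADOOP-9911 and Ted Yu's suggestion above). The usual fix is rebuilding the native libraries from source. A hedged sketch, assuming the Hadoop 2.4.0 source tarball on 64-bit Ubuntu; package names and profile flags follow Hadoop's BUILDING.txt, and note that Hadoop 2.x requires protobuf 2.5 specifically:]

```shell
# Sketch: rebuild Hadoop's native libraries with snappy support on
# 64-bit Ubuntu. Verify package names and versions against BUILDING.txt
# in your source tree (protobuf must be 2.5.x for Hadoop 2.4).
sudo apt-get install build-essential cmake zlib1g-dev libsnappy-dev \
    libssl-dev maven protobuf-compiler

cd hadoop-2.4.0-src
# -Pnative builds libhadoop.so; -Drequire.snappy makes the build fail
# loudly instead of silently producing a snappy-less library.
mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy

# Replace the shipped native libs with the freshly built 64-bit ones:
cp hadoop-dist/target/hadoop-2.4.0/lib/native/* \
   /usr/local/hadoop-2.4.0/lib/native/
```

After copying, the native libraries should load on restart; the same rebuilt libs need to be distributed to every node in the cluster.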

Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by peterm_second <re...@gmail.com>.
Sorry I should've mentioned this. I've installed snappy lib using 
apt-get , my hadoop had no snappy  support build in.

Peter


On 10.12.2014 19:28, Ted Yu wrote:
> See:
> https://issues.apache.org/jira/browse/HADOOP-9911
>
> Can you recompile snappy for 64-bit system ?
>
> Cheers
>
> On Wed, Dec 10, 2014 at 9:22 AM, peterm_second <regestrer@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     Hi Fabio ,
>     Thanks for the reply, but unfortunately it didn't work. I am using
>     vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using
>     the vanilla distros.
>     I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make
>     any change. What version were you using ?
>
>     Peter
>
>
>
>     On 10.12.2014 16:23, Fabio wrote:
>
>         Not sure it will help, but if the problem is native library
>         loading, I spent a loooong time trying anything to make it work.
>         I may suggest to try also:
>         export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>         export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>         I have this both in the bash "init" script
>         (/etc/profile.p/...) and in
>         /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure
>         it's redundant, but as long as it works I don't change it.
>         I see here I commented out my attempts to set HADOOP_OPTS, so
>         maybe it's not necessary.
>         I don't see anything in my .xml config files.
>         Also, someone says to compile the libraries under your 64 bit
>         system, since the ones in Hadoop are for a 32bit architecture.
>
>         Good luck
>
>         Fabio
>
>         On 12/10/2014 02:57 PM, peterm_second wrote:
>
>             Hi guys,
>             I have a hadoop + hbase + hive application,
>             For some reason my cluster is unable to find the snappy
>             native library
>             Here is the exception :
>              org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>                 at
>             org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
>             Method)
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>                 at
>             org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583)
>
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>                 at
>             org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>                 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>                 at
>             org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>                 at java.security.AccessController.doPrivileged(Native
>             Method)
>                 at javax.security.auth.Subject.doAs(Subject.java:422)
>                 at
>             org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>
>                 at
>             org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>
>
>             I am working with a 64bit ubuntu 14.04LTS. I've installed
>             snappy on my os and added the coppied the libs to
>             hadoop_home/lib/native
>             I've also added the libs to the JRE, but it still fails as
>             if nothing is present.
>             I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true
>             $GC_DEBUG_OPTS
>             -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native
>             $HADOOP_OPTS"
>             in my yarn xml I have
>             <property>
>                   <name>yarn.app.mapreduce.am
>             <http://yarn.app.mapreduce.am>.env</name>
>             <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>             </property>
>
>             in my mapred-site.xml i have
>             <property>
>                     <name>mapred.child.java.opts</name>
>                     <value>
>             -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>                 <property>
>                     <name>mapreduce.reduce.java.opts</name>
>             <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>
>             The last two were a desperation move.
>             The result is always the same. Any ideas would be welcomed.
>
>             Thanks,
>             Peter
>
>
>
>
>
>
>
>
>
>


Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by peterm_second <re...@gmail.com>.
Sorry I should've mentioned this. I've installed snappy lib using 
apt-get , my hadoop had no snappy  support build in.

Peter


On 10.12.2014 19:28, Ted Yu wrote:
> See:
> https://issues.apache.org/jira/browse/HADOOP-9911
>
> Can you recompile snappy for 64-bit system ?
>
> Cheers
>
> On Wed, Dec 10, 2014 at 9:22 AM, peterm_second <regestrer@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     Hi Fabio ,
>     Thanks for the reply, but unfortunately it didn't work. I am using
>     vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using
>     the vanilla distros.
>     I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make
>     any change. What version were you using ?
>
>     Peter
>
>
>
>     On 10.12.2014 16:23, Fabio wrote:
>
>         Not sure it will help, but if the problem is native library
>         loading, I spent a loooong time trying anything to make it work.
>         I may suggest to try also:
>         export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>         export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>         I have this both in the bash "init" script
>         (/etc/profile.p/...) and in
>         /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure
>         it's redundant, but as long as it works I don't change it.
>         I see here I commented out my attempts to set HADOOP_OPTS, so
>         maybe it's not necessary.
>         I don't see anything in my .xml config files.
>         Also, someone says to compile the libraries under your 64 bit
>         system, since the ones in Hadoop are for a 32bit architecture.
>
>         Good luck
>
>         Fabio
>
>         On 12/10/2014 02:57 PM, peterm_second wrote:
>
>             Hi guys,
>             I have a hadoop + hbase + hive application,
>             For some reason my cluster is unable to find the snappy
>             native library
>             Here is the exception :
>              org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>                 at
>             org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
>             Method)
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>                 at
>             org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583)
>
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>                 at
>             org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>                 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>                 at
>             org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>                 at java.security.AccessController.doPrivileged(Native
>             Method)
>                 at javax.security.auth.Subject.doAs(Subject.java:422)
>                 at
>             org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>
>                 at
>             org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>
>
>             I am working with a 64bit ubuntu 14.04LTS. I've installed
>             snappy on my os and added the coppied the libs to
>             hadoop_home/lib/native
>             I've also added the libs to the JRE, but it still fails as
>             if nothing is present.
>             I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true
>             $GC_DEBUG_OPTS
>             -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native
>             $HADOOP_OPTS"
>             in my yarn xml I have
>             <property>
>                   <name>yarn.app.mapreduce.am
>             <http://yarn.app.mapreduce.am>.env</name>
>             <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>             </property>
>
>             in my mapred-site.xml i have
>             <property>
>                     <name>mapred.child.java.opts</name>
>                     <value>
>             -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>                 <property>
>                     <name>mapreduce.reduce.java.opts</name>
>             <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>
>             The last two were a desperation move.
>             The result is always the same. Any ideas would be welcomed.
>
>             Thanks,
>             Peter
>
>
>
>
>
>
>
>
>
>


Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by peterm_second <re...@gmail.com>.
Sorry I should've mentioned this. I've installed snappy lib using 
apt-get , my hadoop had no snappy  support build in.

Peter


On 10.12.2014 19:28, Ted Yu wrote:
> See:
> https://issues.apache.org/jira/browse/HADOOP-9911
>
> Can you recompile snappy for 64-bit system ?
>
> Cheers
>
> On Wed, Dec 10, 2014 at 9:22 AM, peterm_second <regestrer@gmail.com 
> <ma...@gmail.com>> wrote:
>
>     Hi Fabio ,
>     Thanks for the reply, but unfortunately it didn't work. I am using
>     vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using
>     the vanilla distros.
>     I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make
>     any change. What version were you using ?
>
>     Peter
>
>
>
>     On 10.12.2014 16:23, Fabio wrote:
>
>         Not sure it will help, but if the problem is native library
>         loading, I spent a loooong time trying anything to make it work.
>         I may suggest to try also:
>         export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>         export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>         I have this both in the bash "init" script
>         (/etc/profile.p/...) and in
>         /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure
>         it's redundant, but as long as it works I don't change it.
>         I see here I commented out my attempts to set HADOOP_OPTS, so
>         maybe it's not necessary.
>         I don't see anything in my .xml config files.
>         Also, someone says to compile the libraries under your 64 bit
>         system, since the ones in Hadoop are for a 32bit architecture.
>
>         Good luck
>
>         Fabio
>
>         On 12/10/2014 02:57 PM, peterm_second wrote:
>
>             Hi guys,
>             I have a hadoop + hbase + hive application,
>             For some reason my cluster is unable to find the snappy
>             native library
>             Here is the exception :
>              org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>                 at
>             org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
>             Method)
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>
>                 at
>             org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>                 at
>             org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>                 at
>             org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583)
>
>                 at
>             org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>                 at
>             org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>                 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>                 at
>             org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>                 at java.security.AccessController.doPrivileged(Native
>             Method)
>                 at javax.security.auth.Subject.doAs(Subject.java:422)
>                 at
>             org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
>
>                 at
>             org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>
>
>             I am working with 64-bit Ubuntu 14.04 LTS. I've installed
>             snappy on my OS and copied the libs to
>             hadoop_home/lib/native.
>             I've also added the libs to the JRE, but it still fails as
>             if nothing is present.
>             I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true
>             $GC_DEBUG_OPTS
>             -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native
>             $HADOOP_OPTS"
>             in my yarn xml I have
>             <property>
>                   <name>yarn.app.mapreduce.am.env</name>
>             <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>             </property>
>
>             in my mapred-site.xml i have
>             <property>
>                     <name>mapred.child.java.opts</name>
>                     <value>
>             -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>                 <property>
>                     <name>mapreduce.reduce.java.opts</name>
>             <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>                 </property>
>
>             The last two were a desperation move.
>             The result is always the same. Any ideas would be welcomed.
>
>             Thanks,
>             Peter
>
>
>
>
>
>
>
>
>
>


Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by Ted Yu <yu...@gmail.com>.
See:
https://issues.apache.org/jira/browse/HADOOP-9911

Can you recompile snappy for 64-bit system ?

Cheers

On Wed, Dec 10, 2014 at 9:22 AM, peterm_second <re...@gmail.com> wrote:

> Hi Fabio ,
> Thanks for the reply, but unfortunately it didn't work. I am using vanilla
> hadoop 2.4 with vanilla hive 0.14 and so on, I am using the vanilla distros.
> I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make any
> change. What version were you using ?
>
> Peter
>
>
>
> On 10.12.2014 16:23, Fabio wrote:
>
>> Not sure it will help, but if the problem is native library loading, I
>> spent a loooong time trying anything to make it work.
>> I may suggest to try also:
>> export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>> export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>> I have this both in the bash "init" script (/etc/profile.p/...) and in
>> /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure it's
>> redundant, but as long as it works I don't change it.
>> I see here I commented out my attempts to set HADOOP_OPTS, so maybe it's
>> not necessary.
>> I don't see anything in my .xml config files.
>> Also, someone says to compile the libraries under your 64 bit system,
>> since the ones in Hadoop are for a 32bit architecture.
>>
>> Good luck
>>
>> Fabio
>>
>> On 12/10/2014 02:57 PM, peterm_second wrote:
>>
>>> Hi guys,
>>> I have a hadoop + hbase + hive application,
>>> For some reason my cluster is unable to find the snappy native library
>>> Here is the exception :
>>>  org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>     at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
>>> Method)
>>>     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63)
>>>
>>>     at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>
>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(
>>> CodecPool.java:148)
>>>     at org.apache.hadoop.io.compress.CodecPool.getCompressor(
>>> CodecPool.java:163)
>>>     at org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>>>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.
>>> sortAndSpill(MapTask.java:1583)
>>>     at org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(
>>> MapTask.java:1462)
>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>>     at org.apache.hadoop.security.UserGroupInformation.doAs(
>>> UserGroupInformation.java:1548)
>>>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>>>
>>>
>>> I am working with 64-bit Ubuntu 14.04 LTS. I've installed snappy on my
>>> OS and copied the libs to hadoop_home/lib/native.
>>> I've also added the libs to the JRE, but it still fails as if nothing is
>>> present.
>>> I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true $GC_DEBUG_OPTS
>>> -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native $HADOOP_OPTS"
>>> in my yarn xml I have
>>> <property>
>>>       <name>yarn.app.mapreduce.am.env</name>
>>> <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>>> </property>
>>>
>>> in my mapred-site.xml i have
>>> <property>
>>>         <name>mapred.child.java.opts</name>
>>>         <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>     </property>
>>>     <property>
>>>         <name>mapreduce.reduce.java.opts</name>
>>> <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>     </property>
>>>
>>> The last two were a desperation move.
>>> The result is always the same. Any ideas would be welcomed.
>>>
>>> Thanks,
>>> Peter
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>


Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by Fabio <an...@gmail.com>.
Plain Apache Hadoop 2.5.0.
Too bad it didn't work, hope someone can help.

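For anyone following along, the environment-variable approach suggested earlier in the thread, collected as a single hadoop-env.sh fragment, might look like this. The /usr/local/hadoop-2.4.0 prefix is an assumption taken from Peter's config (Fabio's layout uses /opt/yarn/hadoop-2.5.0 instead), so substitute your own:

```shell
# Hypothetical hadoop-env.sh fragment; the HADOOP_HOME path is an assumption.
export HADOOP_HOME=/usr/local/hadoop-2.4.0
export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native"
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
# Belt and braces: also pass java.library.path to every Hadoop JVM.
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```

None of this helps if the library itself is the wrong architecture, so it is worth checking that first.
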
On 12/10/2014 06:22 PM, peterm_second wrote:
> Hi Fabio ,
> Thanks for the reply, but unfortunately it didn't work. I am using 
> vanilla hadoop 2.4 with vanilla hive 0.14 and so on, I am using the 
> vanilla distros.
> I did set the HADOOP_COMMON_LIB_NATIVE_DIR but that didn't make any 
> change. What version were you using ?
>
> Peter
>
>
> On 10.12.2014 16:23, Fabio wrote:
>> Not sure it will help, but if the problem is native library loading, 
>> I spent a loooong time trying anything to make it work.
>> I may suggest to try also:
>> export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
>> export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
>> I have this both in the bash "init" script (/etc/profile.p/...) and 
>> in /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure it's 
>> redundant, but as long as it works I don't change it.
>> I see here I commented out my attempts to set HADOOP_OPTS, so maybe 
>> it's not necessary.
>> I don't see anything in my .xml config files.
>> Also, someone says to compile the libraries under your 64 bit system, 
>> since the ones in Hadoop are for a 32bit architecture.
>>
>> Good luck
>>
>> Fabio
>>
>> On 12/10/2014 02:57 PM, peterm_second wrote:
>>> Hi guys,
>>> I have a hadoop + hbase + hive application,
>>> For some reason my cluster is unable to find the snappy native library
>>> Here is the exception :
>>>  org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>     at 
>>> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native 
>>> Method)
>>>     at 
>>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:63) 
>>>
>>>     at 
>>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132) 
>>>
>>>     at 
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>     at 
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>     at org.apache.hadoop.mapred.IFile$Writer.<init>(IFile.java:115)
>>>     at 
>>> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.sortAndSpill(MapTask.java:1583) 
>>>
>>>     at 
>>> org.apache.hadoop.mapred.MapTask$MapOutputBuffer.flush(MapTask.java:1462)
>>>     at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:437)
>>>     at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
>>>     at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:167)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at javax.security.auth.Subject.doAs(Subject.java:422)
>>>     at 
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548) 
>>>
>>>     at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
>>>
>>>
>>> I am working with 64-bit Ubuntu 14.04 LTS. I've installed snappy on
>>> my OS and copied the libs to hadoop_home/lib/native.
>>> I've also added the libs to the JRE, but it still fails as if 
>>> nothing is present.
>>> I've added  HADOOP_OPTS="-Djava.net.preferIPv4Stack=true 
>>> $GC_DEBUG_OPTS 
>>> -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native $HADOOP_OPTS"
>>> in my yarn xml I have
>>> <property>
>>>       <name>yarn.app.mapreduce.am.env</name>
>>> <value>LD_LIBRARY_PATH=$HADOOP_HOME/lib/native</value>
>>> </property>
>>>
>>> in my mapred-site.xml i have
>>> <property>
>>>         <name>mapred.child.java.opts</name>
>>>         <value> 
>>> -Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>     </property>
>>>     <property>
>>>         <name>mapreduce.reduce.java.opts</name>
>>> <value>-Djava.library.path=/usr/local/hadoop-2.4.0/lib/native</value>
>>>     </property>
>>>
>>> The last two were a desperation move.
>>> The result is always the same. Any ideas would be welcomed.
>>>
>>> Thanks,
>>> Peter
>>>
>>>
>>>
>>>
>>>
>>>
>>>
>>
>
>




Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by peterm_second <re...@gmail.com>.
Hi Fabio,
Thanks for the reply, but unfortunately it didn't work. I am using 
vanilla Hadoop 2.4 with vanilla Hive 0.14 and so on; all stock Apache 
distributions.
I did set HADOOP_COMMON_LIB_NATIVE_DIR, but that didn't make any 
difference. What version were you using?

Peter

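One more thing worth ruling out when snappy comes from apt-get: native loaders typically resolve a library by its soname (for snappy usually libsnappy.so.1; whether your Hadoop build uses exactly that name is an assumption), and a package can leave the directory with only the fully versioned file. A hypothetical helper to add a missing link name:

```shell
# Sketch: create a soname symlink if it is missing, so the runtime
# loader can resolve the library. All names below are assumptions.
ensure_symlink() {
  dir=$1; target=$2; link=$3
  [ -e "$dir/$link" ] || ln -s "$target" "$dir/$link"
}

# Hypothetical usage; run `ls /usr/local/hadoop-2.4.0/lib/native` first
# to see which file names you actually have:
# ensure_symlink /usr/local/hadoop-2.4.0/lib/native libsnappy.so.1.1.0 libsnappy.so.1
```
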

On 10.12.2014 16:23, Fabio wrote:
> Not sure it will help, but if the problem is native library loading, I 
> spent a loooong time trying anything to make it work.
> I may suggest to try also:
> export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
> export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
> I have this both in the bash "init" script (/etc/profile.p/...) and in 
> /opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure it's 
> redundant, but as long as it works I don't change it.
> I see here I commented out my attempts to set HADOOP_OPTS, so maybe 
> it's not necessary.
> I don't see anything in my .xml config files.
> Also, some suggest compiling the native libraries on your 64-bit 
> system, since the ones bundled with Hadoop are built for a 32-bit 
> architecture.
>
> Good luck
>
> Fabio


Re: Hadoop 2.4 + Hive 0.14 + Hbase 0.98.3 + snappy not working

Posted by Fabio <an...@gmail.com>.
Not sure it will help, but if the problem is native library loading, I 
spent a loooong time trying anything to make it work.
I may suggest to try also:
export JAVA_LIBRARY_PATH=/opt/yarn/hadoop-2.5.0/lib/native
export HADOOP_COMMON_LIB_NATIVE_DIR=/opt/yarn/hadoop-2.5.0/lib
I have this both in the bash "init" script (/etc/profile.d/...) and in 
/opt/yarn/hadoop-2.5.0/etc/hadoop/hadoop-env.sh; quite sure it's 
redundant, but as long as it works I don't change it.
I see here I commented out my attempts to set HADOOP_OPTS, so maybe it's 
not necessary.
I don't see anything in my .xml config files.
Also, some suggest compiling the native libraries on your 64-bit 
system, since the ones bundled with Hadoop are built for a 32-bit 
architecture.

Good luck

Fabio
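
Pulling the environment-variable suggestions in this thread together, a minimal hadoop-env.sh sketch (paths follow this thread's /usr/local/hadoop-2.4.0 example; adjust to your layout, and note the actual fix may still be a 64-bit rebuild of the libs):

```shell
# hadoop-env.sh sketch: the env vars tried in this thread, consolidated.
# Paths assume /usr/local/hadoop-2.4.0 as used above; adjust to yours.
export HADOOP_HOME=/usr/local/hadoop-2.4.0
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native"
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```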


