Posted to user@hbase.apache.org by Jean-Marc Spaggiari <je...@spaggiari.org> on 2014/08/26 13:38:47 UTC

Re: Compilation error: HBASE 0.98.4 with Snappy

Hi Arthur,

Do you have the Snappy libs installed and configured? HBase doesn't come with
Snappy, so you need to have it installed first.

Shameless plug:
http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY

This is for 0.96 but should be very similar for 0.98. I will try it soon
and post an update, but keep us posted here so we can support you...

JM
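As a first sanity check before building, something like the following sketch can confirm the native Snappy library is actually visible on the box (the helper name check_snappy is mine, and package names vary: snappy/snappy-devel on RHEL/CentOS, libsnappy-dev on Debian/Ubuntu):

```shell
# Hedged sketch: confirm libsnappy is installed before building HBase with
# Snappy support. check_snappy is a made-up helper name, not part of HBase.
check_snappy() {
  # Prefer the dynamic linker cache; fall back to the common library dirs.
  if ldconfig -p 2>/dev/null | grep -q 'libsnappy\.so'; then
    echo "snappy: found in linker cache"
  elif ls /usr/lib64/libsnappy.so* /usr/lib/libsnappy.so* >/dev/null 2>&1; then
    echo "snappy: found in a standard lib dir"
  else
    echo "snappy: not found - install it first"
    return 1
  fi
}
check_snappy
```

If this reports "not found", installing the OS package (as done later in this thread with yum) is the first step before any Maven work.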


2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:

> Hi,
>
> I need to install snappy to HBase 0.98.4.  (my Hadoop version is 2.4.1)
>
> Can you please advise what would be wrong?  Should my pom.xml be incorrect
> and missing something?
>
> Regards
> Arthur
>
>
> Below are my commands:
> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy
>
> Log:
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Building HBase - Server 0.98.4-hadoop2
> [INFO]
> ------------------------------------------------------------------------
> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> is missing, no dependency information available
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] HBase ............................................. SUCCESS [3.129s]
> [INFO] HBase - Common .................................... SUCCESS [3.105s]
> [INFO] HBase - Protocol .................................. SUCCESS [0.976s]
> [INFO] HBase - Client .................................... SUCCESS [0.925s]
> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.183s]
> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.497s]
> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.407s]
> [INFO] HBase - Server .................................... FAILURE [0.103s]
> [INFO] HBase - Testing Util .............................. SKIPPED
> [INFO] HBase - Thrift .................................... SKIPPED
> [INFO] HBase - Shell ..................................... SKIPPED
> [INFO] HBase - Integration Tests ......................... SKIPPED
> [INFO] HBase - Examples .................................. SKIPPED
> [INFO] HBase - Assembly .................................. SKIPPED
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time: 9.939s
> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> [INFO] Final Memory: 61M/2921M
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal on project hbase-server: Could not resolve
> dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> http://maven.oschina.net/content/groups/public/ was cached in the local
> repository, resolution will not be reattempted until the update interval of
> nexus-osc has elapsed or updates are forced -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the
> command
> [ERROR]   mvn <goals> -rf :hbase-server
>
>
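For what it's worth, the "was cached in the local repository" part of that error means Maven remembered the failed lookup and will not retry it on its own until the repository's update interval elapses. A hedged sketch of clearing the cached entry before retrying (the path is the Maven default local repository; the hadoop-snappy SNAPSHOT itself still has to be built and installed locally, since no public repository hosts it):

```shell
# Hedged sketch: drop Maven's cached record of the failed hadoop-snappy
# lookup, then retry the build with -U to force dependency re-resolution.
M2_SNAPPY="$HOME/.m2/repository/org/apache/hadoop/hadoop-snappy"
if [ -d "$M2_SNAPPY" ]; then
  rm -rf "$M2_SNAPPY"   # remove the cached failure marker
fi
echo "cleared cache entry: $M2_SNAPPY"
# Retry with the same goals used in this thread, plus -U:
#   mvn -f pom.xml.hadoop2 install -DskipTests -U -Prelease,hadoop-snappy assembly:single
```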

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Hi Arthur,

Welcome to our world ;)

As for JAVA_LIBRARY_PATH, I don't even set it anywhere.

hbase@node3:~/hbase-0.94.3$ echo $JAVA_LIBRARY_PATH

hbase@node3:~/hbase-0.94.3$ grep JAVA_LIBRARY_PATH conf/hbase-env.sh
hbase@node3:~/hbase-0.94.3$ grep JAVA_LIBRARY_PATH bin/*
bin/hbase:#   HBASE_LIBRARY_PATH  HBase additions to JAVA_LIBRARY_PATH for adding
bin/hbase:#If avail, add Hadoop to the CLASSPATH and to the JAVA_LIBRARY_PATH
bin/hbase:  HADOOP_JAVA_LIBRARY_PATH=$(HADOOP_CLASSPATH="$CLASSPATH" ${HADOOP_IN_PATH} \
bin/hbase:  if [ -n "$HADOOP_JAVA_LIBRARY_PATH" ]; then
bin/hbase:    JAVA_LIBRARY_PATH=$(append_path "${JAVA_LIBRARY_PATH}" "$HADOOP_JAVA_LIBRARY_PATH")
bin/hbase:    JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH" ${HBASE_HOME}/build/native/${JAVA_PLATFORM}/lib)
bin/hbase:    JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH" ${HBASE_HOME}/lib/native/${JAVA_PLATFORM})
bin/hbase:  JAVA_LIBRARY_PATH=`cygpath -p "$JAVA_LIBRARY_PATH"`
bin/hbase:if [ "x$JAVA_LIBRARY_PATH" != "x" ]; then
bin/hbase:  HBASE_OPTS="$HBASE_OPTS -Djava.library.path=$JAVA_LIBRARY_PATH"
bin/hbase:  export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$JAVA_LIBRARY_PATH"

It's all the default values everywhere.
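For reference, the append_path helper those grep hits call is, in sketch form (this is my paraphrase of the bin/hbase idiom, not a verbatim copy), just a guarded colon-join, which is why an unset JAVA_LIBRARY_PATH is harmless:

```shell
# Sketch of the append_path idiom used by bin/hbase (paraphrased): join two
# path fragments with ':' but avoid a leading ':' when the first is empty.
append_path() {
  if [ -z "$1" ]; then
    echo "$2"
  else
    echo "$1:$2"
  fi
}

# Starting from an empty JAVA_LIBRARY_PATH, successive appends build the path:
JLP=$(append_path "" "/opt/hbase/lib/native/Linux-amd64-64")
JLP=$(append_path "$JLP" "/usr/local/lib")
echo "$JLP"
```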

But I don't see anything wrong with what you have done.

JM


2014-08-27 7:06 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:

> Hi JM,
>
> Thank you so much!
>
> I had not set JAVA_LIBRARY_PATH before.
> Now I added [export JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"] to hbase-env.sh
> and also added [export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native/Linux-amd64-64"] to hadoop-env.sh.
> I hope this is the correct way.
>
> Can you please share how you define JAVA_LIBRARY_PATH in your hbase-env.sh
> and hadoop-env.sh, as this is new to me (I am also new to HBase :) )?
>
> Regards
> Arthur
>
> On 27 Aug, 2014, at 6:41 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
> wrote:
>
> > Hi Arthur,
> >
> > Glad to hear you got it!
> >
> > Regarding #2, was JAVA_LIBRARY_PATH already set before? If so, that might
> > have been the issue. HBase will append to this path everything it needs (if
> > required), so I don't think there is anything else you need to add.
> >
> > Regarding #1, I don't think it's an error. It might be more of a warning.
> > I will look at it to see where it comes from...
> >
> > JM
> >
> >
> > 2014-08-27 4:00 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:
> >
> >> Hi,
> >>
> >> Many thanks for your advices!
> >>
> >> Finally, I managed to make it work.
> >>
> >> I needed to add:
> >> export JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"
> >>
> >> then run:
> >> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >> file:///tmp/snappy-test snappy
> >> 2014-08-27 15:51:39,459 INFO  [main] Configuration.deprecation:
> >> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >> SLF4J: Class path contains multiple SLF4J bindings.
> >> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >> explanation.
> >> 2014-08-27 15:51:39,785 INFO  [main] util.ChecksumType: Checksum using
> >> org.apache.hadoop.util.PureJavaCrc32
> >> 2014-08-27 15:51:39,786 INFO  [main] util.ChecksumType: Checksum can use
> >> org.apache.hadoop.util.PureJavaCrc32C
> >> 2014-08-27 15:51:39,926 INFO  [main] compress.CodecPool: Got brand-new
> >> compressor [.snappy]
> >> 2014-08-27 15:51:39,930 INFO  [main] compress.CodecPool: Got brand-new
> >> compressor [.snappy]
> >> 2014-08-27 15:51:39,934 ERROR [main] hbase.KeyValue: Unexpected
> >> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> >> 2014-08-27 15:51:40,185 INFO  [main] compress.CodecPool: Got brand-new
> >> decompressor [.snappy]
> >> SUCCESS
> >>
> >>
> >> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >> file:///tmp/snappy-test gz
> >> 2014-08-27 15:57:18,633 INFO  [main] Configuration.deprecation:
> >> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >> SLF4J: Class path contains multiple SLF4J bindings.
> >> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >> explanation.
> >> 2014-08-27 15:57:18,969 INFO  [main] util.ChecksumType: Checksum using
> >> org.apache.hadoop.util.PureJavaCrc32
> >> 2014-08-27 15:57:18,970 INFO  [main] util.ChecksumType: Checksum can use
> >> org.apache.hadoop.util.PureJavaCrc32C
> >> 2014-08-27 15:57:19,127 INFO  [main] zlib.ZlibFactory: Successfully
> loaded
> >> & initialized native-zlib library
> >> 2014-08-27 15:57:19,146 INFO  [main] compress.CodecPool: Got brand-new
> >> compressor [.gz]
> >> 2014-08-27 15:57:19,149 INFO  [main] compress.CodecPool: Got brand-new
> >> compressor [.gz]
> >> 2014-08-27 15:57:19,153 ERROR [main] hbase.KeyValue: Unexpected
> >> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> >> 2014-08-27 15:57:19,401 INFO  [main] compress.CodecPool: Got brand-new
> >> decompressor [.gz]
> >> SUCCESS
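Once CompressionTest reports SUCCESS for snappy, a natural follow-up check (hedged; the table and family names here are placeholders, not anything from this thread) is creating a table with Snappy compression from the HBase shell and confirming it round-trips data:

```
# In the HBase shell ('testsnappy' and 'cf' are placeholder names):
create 'testsnappy', {NAME => 'cf', COMPRESSION => 'SNAPPY'}
put 'testsnappy', 'row1', 'cf:q', 'value'
scan 'testsnappy'
describe 'testsnappy'    # should show COMPRESSION => 'SNAPPY'
```

Unlike CompressionTest, this exercises the region server's codec loading, so it can catch a case where only the client-side path was fixed.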
> >>
> >>
> >> 2 questions:
> >> 1) Is it OK that the test prints "SUCCESS" along with "ERROR [main] hbase.KeyValue:
> >> Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey"?
> >> 2) Is this extra setting of "JAVA_LIBRARY_PATH" a good way to set up
> >> snappy with Hadoop 2.4.1 and HBase 0.98.4?
> >>
> >>
> >> Regards
> >> Arthur
> >>
> >>
> >>
> >> On 27 Aug, 2014, at 1:13 pm, Arthur.hk.chan@gmail.com <
> >> arthur.hk.chan@gmail.com> wrote:
> >>
> >>> Hi,
> >>>
> >>> Thanks! I tried, but I still get the same error:
> >>>
> >>> rm hadoop-2.4.1-src -Rf          // delete all old src files and try again
> >>> tar -vxf hadoop-2.4.1-src.tar.gz
> >>> cd hadoop-2.4.1-src
> >>> mvn -DskipTests clean install -Drequire.snappy=true -Pnative          // compile with snappy
> >>> [INFO]
> >>> [INFO] Apache Hadoop Main ................................ SUCCESS
> >> [0.887s]
> >>> [INFO] Apache Hadoop Project POM ......................... SUCCESS
> >> [0.306s]
> >>> [INFO] Apache Hadoop Annotations ......................... SUCCESS
> >> [0.859s]
> >>> [INFO] Apache Hadoop Project Dist POM .................... SUCCESS
> >> [0.231s]
> >>> [INFO] Apache Hadoop Assemblies .......................... SUCCESS
> >> [0.071s]
> >>> [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
> >> [0.960s]
> >>> [INFO] Apache Hadoop MiniKDC ............................. SUCCESS
> >> [0.711s]
> >>> [INFO] Apache Hadoop Auth ................................ SUCCESS
> >> [0.641s]
> >>> [INFO] Apache Hadoop Auth Examples ....................... SUCCESS
> >> [0.528s]
> >>> [INFO] Apache Hadoop Common .............................. SUCCESS
> >> [7.859s]
> >>> [INFO] Apache Hadoop NFS ................................. SUCCESS
> >> [0.282s]
> >>> [INFO] Apache Hadoop Common Project ...................... SUCCESS
> >> [0.013s]
> >>> [INFO] Apache Hadoop HDFS ................................ SUCCESS
> >> [14.210s]
> >>> [INFO] Apache Hadoop HttpFS .............................. SUCCESS
> >> [1.322s]
> >>> [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
> >> [0.418s]
> >>> [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
> >> [0.178s]
> >>> [INFO] Apache Hadoop HDFS Project ........................ SUCCESS
> >> [0.016s]
> >>> [INFO] hadoop-yarn ....................................... SUCCESS
> >> [0.014s]
> >>> [INFO] hadoop-yarn-api ................................... SUCCESS
> >> [3.012s]
> >>> [INFO] hadoop-yarn-common ................................ SUCCESS
> >> [1.173s]
> >>> [INFO] hadoop-yarn-server ................................ SUCCESS
> >> [0.029s]
> >>> [INFO] hadoop-yarn-server-common ......................... SUCCESS
> >> [0.379s]
> >>> [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
> >> [0.612s]
> >>> [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
> >> [0.166s]
> >>> [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
> >> [0.213s]
> >>> [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
> >> [0.970s]
> >>> [INFO] hadoop-yarn-server-tests .......................... SUCCESS
> >> [0.158s]
> >>> [INFO] hadoop-yarn-client ................................ SUCCESS
> >> [0.227s]
> >>> [INFO] hadoop-yarn-applications .......................... SUCCESS
> >> [0.013s]
> >>> [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
> >> [0.157s]
> >>> [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
> >> [0.094s]
> >>> [INFO] hadoop-yarn-site .................................. SUCCESS
> >> [0.024s]
> >>> [INFO] hadoop-yarn-project ............................... SUCCESS
> >> [0.030s]
> >>> [INFO] hadoop-mapreduce-client ........................... SUCCESS
> >> [0.027s]
> >>> [INFO] hadoop-mapreduce-client-core ...................... SUCCESS
> >> [1.206s]
> >>> [INFO] hadoop-mapreduce-client-common .................... SUCCESS
> >> [1.140s]
> >>> [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
> >> [0.128s]
> >>> [INFO] hadoop-mapreduce-client-app ....................... SUCCESS
> >> [0.634s]
> >>> [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
> >> [0.557s]
> >>> [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
> >> [0.882s]
> >>> [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
> >> [0.085s]
> >>> [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
> >> [0.224s]
> >>> [INFO] hadoop-mapreduce .................................. SUCCESS
> >> [0.030s]
> >>> [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
> >> [0.200s]
> >>> [INFO] Apache Hadoop Distributed Copy .................... SUCCESS
> >> [0.656s]
> >>> [INFO] Apache Hadoop Archives ............................ SUCCESS
> >> [0.112s]
> >>> [INFO] Apache Hadoop Rumen ............................... SUCCESS
> >> [0.246s]
> >>> [INFO] Apache Hadoop Gridmix ............................. SUCCESS
> >> [0.283s]
> >>> [INFO] Apache Hadoop Data Join ........................... SUCCESS
> >> [0.111s]
> >>> [INFO] Apache Hadoop Extras .............................. SUCCESS
> >> [0.146s]
> >>> [INFO] Apache Hadoop Pipes ............................... SUCCESS
> >> [0.011s]
> >>> [INFO] Apache Hadoop OpenStack support ................... SUCCESS
> >> [0.283s]
> >>> [INFO] Apache Hadoop Client .............................. SUCCESS
> >> [0.106s]
> >>> [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
> >> [0.038s]
> >>> [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
> >> [0.223s]
> >>> [INFO] Apache Hadoop Tools Dist .......................... SUCCESS
> >> [0.106s]
> >>> [INFO] Apache Hadoop Tools ............................... SUCCESS
> >> [0.010s]
> >>> [INFO] Apache Hadoop Distribution ........................ SUCCESS
> >> [0.034s]
> >>> [INFO]
> >> ------------------------------------------------------------------------
> >>> [INFO] BUILD SUCCESS
> >>> [INFO]
> >> ------------------------------------------------------------------------
> >>> [INFO] Total time: 45.478s
> >>> [INFO] Finished at: Wed Aug 27 12:10:06 HKT 2014
> >>> [INFO] Final Memory: 107M/1898M
> >>> [INFO]
> >> ------------------------------------------------------------------------
> >>> mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy=true          // package it with snappy
> >>> [INFO]
> >> ------------------------------------------------------------------------
> >>> [INFO] Reactor Summary:
> >>> [INFO]
> >>> [INFO] Apache Hadoop Main ................................ SUCCESS
> >> [0.727s]
> >>> [INFO] Apache Hadoop Project POM ......................... SUCCESS
> >> [0.555s]
> >>> [INFO] Apache Hadoop Annotations ......................... SUCCESS
> >> [1.011s]
> >>> [INFO] Apache Hadoop Assemblies .......................... SUCCESS
> >> [0.128s]
> >>> [INFO] Apache Hadoop Project Dist POM .................... SUCCESS
> >> [1.342s]
> >>> [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
> >> [1.251s]
> >>> [INFO] Apache Hadoop MiniKDC ............................. SUCCESS
> >> [1.007s]
> >>> [INFO] Apache Hadoop Auth ................................ SUCCESS
> >> [1.252s]
> >>> [INFO] Apache Hadoop Auth Examples ....................... SUCCESS
> >> [0.929s]
> >>> [INFO] Apache Hadoop Common .............................. SUCCESS
> >> [41.330s]
> >>> [INFO] Apache Hadoop NFS ................................. SUCCESS
> >> [1.986s]
> >>> [INFO] Apache Hadoop Common Project ...................... SUCCESS
> >> [0.015s]
> >>> [INFO] Apache Hadoop HDFS ................................ SUCCESS
> >> [1:08.367s]
> >>> [INFO] Apache Hadoop HttpFS .............................. SUCCESS
> >> [47.198s]
> >>> [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
> >> [2.807s]
> >>> [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
> >> [1.350s]
> >>> [INFO] Apache Hadoop HDFS Project ........................ SUCCESS
> >> [0.027s]
> >>> [INFO] hadoop-yarn ....................................... SUCCESS
> >> [0.013s]
> >>> [INFO] hadoop-yarn-api ................................... SUCCESS
> >> [36.848s]
> >>> [INFO] hadoop-yarn-common ................................ SUCCESS
> >> [12.502s]
> >>> [INFO] hadoop-yarn-server ................................ SUCCESS
> >> [0.032s]
> >>> [INFO] hadoop-yarn-server-common ......................... SUCCESS
> >> [3.688s]
> >>> [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
> >> [8.207s]
> >>> [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
> >> [1.048s]
> >>> [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
> >> [1.839s]
> >>> [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
> >> [4.766s]
> >>> [INFO] hadoop-yarn-server-tests .......................... SUCCESS
> >> [0.247s]
> >>> [INFO] hadoop-yarn-client ................................ SUCCESS
> >> [1.735s]
> >>> [INFO] hadoop-yarn-applications .......................... SUCCESS
> >> [0.013s]
> >>> [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
> >> [0.984s]
> >>> [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
> >> [0.792s]
> >>> [INFO] hadoop-yarn-site .................................. SUCCESS
> >> [0.034s]
> >>> [INFO] hadoop-yarn-project ............................... SUCCESS
> >> [3.327s]
> >>> [INFO] hadoop-mapreduce-client ........................... SUCCESS
> >> [0.090s]
> >>> [INFO] hadoop-mapreduce-client-core ...................... SUCCESS
> >> [7.451s]
> >>> [INFO] hadoop-mapreduce-client-common .................... SUCCESS
> >> [7.081s]
> >>> [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
> >> [0.972s]
> >>> [INFO] hadoop-mapreduce-client-app ....................... SUCCESS
> >> [3.085s]
> >>> [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
> >> [3.119s]
> >>> [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
> >> [1.934s]
> >>> [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
> >> [0.772s]
> >>> [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
> >> [2.162s]
> >>> [INFO] hadoop-mapreduce .................................. SUCCESS
> >> [2.622s]
> >>> [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
> >> [1.744s]
> >>> [INFO] Apache Hadoop Distributed Copy .................... SUCCESS
> >> [4.466s]
> >>> [INFO] Apache Hadoop Archives ............................ SUCCESS
> >> [0.956s]
> >>> [INFO] Apache Hadoop Rumen ............................... SUCCESS
> >> [2.203s]
> >>> [INFO] Apache Hadoop Gridmix ............................. SUCCESS
> >> [1.509s]
> >>> [INFO] Apache Hadoop Data Join ........................... SUCCESS
> >> [0.909s]
> >>> [INFO] Apache Hadoop Extras .............................. SUCCESS
> >> [1.103s]
> >>> [INFO] Apache Hadoop Pipes ............................... SUCCESS
> >> [4.794s]
> >>> [INFO] Apache Hadoop OpenStack support ................... SUCCESS
> >> [2.111s]
> >>> [INFO] Apache Hadoop Client .............................. SUCCESS
> >> [3.919s]
> >>> [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
> >> [0.044s]
> >>> [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
> >> [1.665s]
> >>> [INFO] Apache Hadoop Tools Dist .......................... SUCCESS
> >> [3.936s]
> >>> [INFO] Apache Hadoop Tools ............................... SUCCESS
> >> [0.042s]
> >>> [INFO] Apache Hadoop Distribution ........................ SUCCESS
> >> [15.208s]
> >>> [INFO]
> >> ------------------------------------------------------------------------
> >>> [INFO] BUILD SUCCESS
> >>> [INFO]
> >> ------------------------------------------------------------------------
> >>> [INFO] Total time: 5:22.529s
> >>> [INFO] Finished at: Wed Aug 27 12:17:06 HKT 2014
> >>> [INFO] Final Memory: 86M/755M
> >>> [INFO]
> >> ------------------------------------------------------------------------
> >>>
> >>> ll hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/
> >>> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:12 libhadoop.a
> >>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 12:12 libhadoop.so ->
> >> libhadoop.so.1.0.0
> >>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:12 libhadoop.so.1.0.0
> >>>
> >>> (copy them to $HADOOP_HOME/lib/native/Linux-amd64-64 and $HBASE_HOME/lib/native/Linux-amd64-64)
> >>> cp
> >> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/*
> >> $HADOOP_HOME/lib/native/Linux-amd64-64/
> >>> cp
> >> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/*
> >> $HBASE_HOME/lib/native/Linux-amd64-64/
> >>>
> >>> ll $HADOOP_HOME/lib/native/Linux-amd64-64/
> >>> total 21236
> >>> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a          // new
> >>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so ->
> >> libhadoopsnappy.so.0.0.1
> >>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so.0
> ->
> >> libhadoopsnappy.so.0.0.1
> >>> -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 06:54
> libhadoopsnappy.so.0.0.1
> >>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so          // new
> >>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0          // new
> >>> lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 06:54 libjvm.so ->
> >> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> >>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so ->
> >> libprotobuf-lite.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so.8
> >> -> libprotobuf-lite.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 06:54
> >> libprotobuf-lite.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so ->
> >> libprotobuf.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so.8 ->
> >> libprotobuf.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 06:54 libprotobuf.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so ->
> >> libprotoc.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so.8 ->
> >> libprotoc.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 06:54 libprotoc.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:31 libsnappy.so ->
> >> /usr/lib64/libsnappy.so
> >>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so.1 ->
> >> /usr/lib64/libsnappy.so
> >>> -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 06:54 libsnappy.so.1.2.0
> >>> drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 11:15 pkgconfig
> >>>
> >>>
> >>> ll $HBASE_HOME/lib/native/Linux-amd64-64/
> >>> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a          // new
> >>> -rw-rw-r--. 1 hduser hadoop 1487564 Aug 27 11:14 libhadooppipes.a
> >>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so ->
> >> libhadoopsnappy.so.0.0.1
> >>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0
> ->
> >> libhadoopsnappy.so.0.0.1
> >>> -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 07:08
> libhadoopsnappy.so.0.0.1
> >>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so          // new
> >>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0          // new
> >>> -rw-rw-r--. 1 hduser hadoop  582472 Aug 27 11:14 libhadooputils.a
> >>> -rw-rw-r--. 1 hduser hadoop  298626 Aug 27 11:14 libhdfs.a
> >>> -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so
> >>> -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so.0.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 07:08 libjvm.so ->
> >> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> >>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so ->
> >> libprotobuf-lite.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8
> >> -> libprotobuf-lite.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 07:08
> >> libprotobuf-lite.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so ->
> >> libprotobuf.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
> >> libprotobuf.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so ->
> >> libprotoc.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
> >> libprotoc.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so ->
> >> /usr/lib64/libsnappy.so
> >>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:33 libsnappy.so.1 ->
> >> /usr/lib64/libsnappy.so
> >>> -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> >>> drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 07:08 pkgconfig
> >>>
> >>>
> >>>
> >>> sudo yum install snappy snappy-devel
> >>> Loaded plugins: fastestmirror, security
> >>> Loading mirror speeds from cached hostfile
> >>> ...
> >>> Package snappy-1.1.0-1.el6.x86_64 already installed and latest version
> >>> Package snappy-devel-1.1.0-1.el6.x86_64 already installed and latest
> >> version
> >>> Nothing to do
> >>>
> >>>
> >>> ln -sf /usr/lib64/libsnappy.so $HADOOP_HOME/lib/native/Linux-amd64-64/.
> >>> ln -sf /usr/lib64/libsnappy.so $HBASE_HOME/lib/native/Linux-amd64-64/.
> >>>
> >>> ll $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so
> >>> lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:31
> >> $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so ->
> >> /usr/lib64/libsnappy.s
> >>> ll $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so
> >>> lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:32
> >> $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so ->
> >> /usr/lib64/libsnappy.so
> >>>
> >>>
> >>>
> >>> ($HADOOP_HOME/etc/hadoop/hadoop-env.sh  added following)
> >>> ### 2014-08-27
> >>> export
> >>
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> >>> ###
> >>>
> >>> ($HBASE_HOME/conf/hbase-env.sh added following)
> >>> ### 2014-08-27
> >>> export
> >>
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> >>> export
> >>
> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> >>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> >>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> >>> ###
> >>>
> >>>
> >>> (restarted both HADOOP and HBASE)
> >>> jps
> >>> 26324 HRegionServer
> >>> 26137 HMaster
> >>> 25567 JobHistoryServer
> >>> 25485 NodeManager
> >>> 25913 WebAppProxyServer
> >>> 24831 DataNode
> >>> 24712 NameNode
> >>> 27146 Jps
> >>> 9219 QuorumPeerMain
> >>> 25042 JournalNode
> >>> 25239 DFSZKFailoverController
> >>> 25358 ResourceManager
> >>>
> >>>
> >>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >> file:///tmp/snappy-test snappy
> >>> 2014-08-27 12:24:08,030 INFO  [main] Configuration.deprecation:
> >> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>> SLF4J: Class path contains multiple SLF4J bindings.
> >>> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >> explanation.
> >>> 2014-08-27 12:24:08,387 INFO  [main] util.ChecksumType: Checksum using
> >> org.apache.hadoop.util.PureJavaCrc32
> >>> 2014-08-27 12:24:08,388 INFO  [main] util.ChecksumType: Checksum can
> use
> >> org.apache.hadoop.util.PureJavaCrc32C
> >>> Exception in thread "main" java.lang.RuntimeException: native snappy
> >> library not available: this version of libhadoop was built without
> snappy
> >> support.
> >>>      at
> >>
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >>>      at
> >>
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >>>      at
> >>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >>>      at
> >>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >>>      at
> >>
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >>>      at
> >>
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >>>      at
> >>
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >>>      at
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >>>      at
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >>>      at
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >>>      at
> >>
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >>>      at
> >>
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >>>      at
> >>
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >>>
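That "this version of libhadoop was built without snappy support" message points at the libhadoop.so actually being loaded, not at libsnappy itself. A hedged way to check which kind of libhadoop a given path holds (the directory mirrors the layout used in this thread; grepping `strings` output is a heuristic, not an official interface):

```shell
# Hedged sketch: heuristically check whether a given libhadoop.so was built
# with Snappy support by looking for snappy-related symbols in the binary.
LIBHADOOP="${HADOOP_HOME:-/usr/local/hadoop}/lib/native/Linux-amd64-64/libhadoop.so"
if [ -f "$LIBHADOOP" ] && strings "$LIBHADOOP" | grep -qi snappy; then
  echo "libhadoop: snappy symbols present"
else
  echo "libhadoop: missing or no snappy symbols ($LIBHADOOP)"
fi
# Hadoop also ships a direct native-library report (run from the Hadoop install):
#   hadoop checknative -a
```

If the library on the path shows no snappy symbols, the fix is what this thread converges on: rebuild Hadoop with -Drequire.snappy=true and make sure the rebuilt libhadoop.so is the one on JAVA_LIBRARY_PATH/LD_LIBRARY_PATH.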
> >>>
> >>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >> file:///tmp/snappy-test gz
> >>> 2014-08-27 12:35:34,485 INFO  [main] Configuration.deprecation:
> >> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>> SLF4J: Class path contains multiple SLF4J bindings.
> >>> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: Found binding in
> >>
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >> explanation.
> >>> 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum using
> >> org.apache.hadoop.util.PureJavaCrc32
> >>> 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum can
> use
> >> org.apache.hadoop.util.PureJavaCrc32C
> >>> 2014-08-27 12:35:35,822 INFO  [main] zlib.ZlibFactory: Successfully
> >> loaded & initialized native-zlib library
> >>> 2014-08-27 12:35:35,851 INFO  [main] compress.CodecPool: Got brand-new
> >> compressor [.gz]
> >>> 2014-08-27 12:35:35,855 INFO  [main] compress.CodecPool: Got brand-new
> >> compressor [.gz]
> >>> 2014-08-27 12:35:35,866 ERROR [main] hbase.KeyValue: Unexpected
> >> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> >>> 2014-08-27 12:35:36,636 INFO  [main] compress.CodecPool: Got brand-new
> >> decompressor [.gz]
> >>> SUCCESS
> >>>
> >>>
> >>>
> >>>
> >>>
> >>> So I still get the same issue. I feel the issue comes from the
> >> Hadoop compilation, but I have no idea what would be wrong. Please help.
> >>>
> >>>
> >>> in my /etc/hadoop/core-site.xml, I have following related to snappy:
> >>>   <property>
> >>>    <name>io.compression.codecs</name>
> >>>    <value>
> >>>      org.apache.hadoop.io.compress.GzipCodec,
> >>>      org.apache.hadoop.io.compress.DefaultCodec,
> >>>      org.apache.hadoop.io.compress.BZip2Codec,
> >>>      org.apache.hadoop.io.compress.SnappyCodec
> >>>    </value>
> >>>   </property>
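[Editor's aside, not part of the original mail: before wiring SnappyCodec into core-site.xml as above, it is worth confirming the native library is actually loadable. Recent Hadoop 2.x releases ship `hadoop checknative -a`, which prints one line per native codec; the tiny helper below just greps that report (the function name `snappy_native_ok` is our own, not a Hadoop tool).]

```shell
# Illustrative helper: takes the output of `hadoop checknative -a` as an
# argument (so it can be exercised without a Hadoop install) and succeeds
# only if the snappy line reports "true".
snappy_native_ok() {
  echo "$1" | grep -Eq '^snappy: *true'
}

# Real use on a cluster node:
#   snappy_native_ok "$(hadoop checknative -a 2>&1)" && echo "snappy OK"
```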
> >>>
> >>> in my mapred-site.xml, I have following related to snappy:
> >>>   <property>
> >>>    <name>mapred.output.compress</name>
> >>>    <value>false</value>
> >>>    <description>Should the job outputs be compressed?</description>
> >>>   </property>
> >>>   <property>
> >>>    <name>mapred.output.compression.type</name>
> >>>    <value>RECORD</value>
> >>>    <description>If the job outputs are to be compressed as SequenceFiles,
> >> how should they be compressed? Should be one of NONE, RECORD or
> >> BLOCK.</description>
> >>>   </property>
> >>>   <property>
> >>>    <name>mapred.output.compression.codec</name>
> >>>    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >>>    <description>If the job outputs are compressed, how should they be
> >> compressed?
> >>>    </description>
> >>>   </property>
> >>>   <property>
> >>>    <name>mapred.compress.map.output</name>
> >>>    <value>true</value>
> >>>    <description>Should the outputs of the maps be compressed before
> >> being sent across the network. Uses SequenceFile
> compression.</description>
> >>>   </property>
> >>>   <property>
> >>>    <name>mapred.map.output.compression.codec</name>
> >>>    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >>>    <description>If the map outputs are compressed, how should they be
> >> compressed?</description>
> >>>  </property>
> >>>
> >>>  <property>
> >>>   <name>mapreduce.map.output.compress</name>
> >>>   <value>true</value>
> >>>  </property>
> >>>  <property>
> >>>   <name>mapred.map.output.compress.codec</name>
> >>>   <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >>>  </property>
> >>>
> >>>
> >>> I didn’t add any snappy-related property to hbase-site.xml
> >>>
> >>>
> >>>
> >>> Regards
> >>> Arthur
> >>>
> >>>
> >>>
> >>>
> >>> On 27 Aug, 2014, at 8:07 am, Andrew Purtell <ap...@apache.org>
> wrote:
> >>>
> >>>> On Tue, Aug 26, 2014 at 4:25 PM, Arthur.hk.chan@gmail.com <
> >>>> arthur.hk.chan@gmail.com> wrote:
> >>>>
> >>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
> >>>>> library not available: this version of libhadoop was built without
> >> snappy
> >>>>> support.
> >>>>
> >>>> You are almost there. Unfortunately the native Hadoop libraries you
> >> copied
> >>>> into HBase's lib/native/Linux-amd64-64/ directory were apparently
> >>>> built without snappy support, as the exception indicates. You'll need
> to
> >>>> compile the native Hadoop libraries with snappy support enabled.
> Install
> >>>> snappy-revel as Alex mentioned and then build the Hadoop native
> >> libraries.
> >>>>
> >>>> 1. Get Hadoop sources for the Hadoop version
> >>>> 2. tar xvzf ....
> >>>> 3. cd /path/to/hadoop/src
> >>>> 4. mvn -DskipTests clean install -Drequire.snappy=true -Pnative
> >>>> 5. cp
> >>>>
> >>
> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.*
> >>>> /path/to/hbase/lib/native/Linux-amd64-64
> >>>>
> >>>> (The -Drequire.snappy=true will fail the build if Snappy link
> libraries
> >>>> are not installed, so you can be sure of this.)
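[Editor's aside: Andrew's five steps can be collected into a single sketch. The Hadoop version and the target directory are placeholders taken from this thread's layout, not universal defaults; the `native_out_dir` helper is our own naming.]

```shell
# Hedged consolidation of the rebuild-with-snappy recipe above.
HADOOP_VERSION=2.4.1                                  # example from the thread
HBASE_NATIVE=/path/to/hbase/lib/native/Linux-amd64-64 # placeholder

native_out_dir() {
  # Where the -Pnative build drops libhadoop.* inside a Hadoop source tree
  echo "$1/hadoop-common-project/hadoop-common/target/native/target/usr/local/lib"
}

# tar xzf "hadoop-${HADOOP_VERSION}-src.tar.gz"
# cd "hadoop-${HADOOP_VERSION}-src"
# mvn -DskipTests clean install -Drequire.snappy=true -Pnative
# cp "$(native_out_dir "$PWD")"/libhadoop.* "$HBASE_NATIVE"/
```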
> >>>>
> >>>>
> >>>> --
> >>>> Best regards,
> >>>>
> >>>>  - Andy
> >>>>
> >>>> Problems worthy of attack prove their worth by hitting back. - Piet
> Hein
> >>>> (via Tom White)
> >>>
> >>
> >>
>
>

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi JM,

Thank you so much!

I had not set JAVA_LIBRARY_PATH before. 
Now I added [export JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"] to hbase-env.sh
and also added [export JAVA_LIBRARY_PATH="$HADOOP_HOME/lib/native/Linux-amd64-64"] to hadoop-env.sh.
I hope this is the correct way.
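[Editor's aside: a quick sanity check along these lines can confirm that a directory on JAVA_LIBRARY_PATH actually contains the snappy library. The `find_lib` helper below is illustrative, not an HBase or Hadoop tool.]

```shell
# Walk each entry of a colon-separated library path and print the first
# match for the requested library file; fail if none is found.
find_lib() {
  # $1: colon-separated directory list, $2: library file name
  local dir IFS=':'
  for dir in $1; do
    [ -e "$dir/$2" ] && { echo "$dir/$2"; return 0; }
  done
  return 1
}

# Example: find_lib "$JAVA_LIBRARY_PATH" libsnappy.so || echo "libsnappy.so missing"
```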

Can you please share how you define JAVA_LIBRARY_PATH in your hbase-env.sh and hadoop-env.sh, as this is new to me? (I am also new to HBase :) )

Regards
Arthur

On 27 Aug, 2014, at 6:41 pm, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> Hi Arthur,
> 
> Glad to hear you got it!
> 
> Regarding #2, was JAVA_LIBRARY_PATH already set before? If so, that might
> have been the issue. HBase will append to this path all that it needs (if
> required), so I don't think there is anything else you need to add.
> 
> Regarding #1, I don't think it's an error. It might be more of a warning.
> I will look at it to see where it comes from...
> 
> JM
> 
> 
> 2014-08-27 4:00 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>> :
> 
>> Hi,
>> 
>> Many thanks for your advices!
>> 
>> Finally, I managed to make it work.
>> 
>> I needed to add:
>> export JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"
>> 
>> then run:
>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>> file:///tmp/snappy-test snappy
>> 2014-08-27 15:51:39,459 INFO  [main] Configuration.deprecation:
>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> 2014-08-27 15:51:39,785 INFO  [main] util.ChecksumType: Checksum using
>> org.apache.hadoop.util.PureJavaCrc32
>> 2014-08-27 15:51:39,786 INFO  [main] util.ChecksumType: Checksum can use
>> org.apache.hadoop.util.PureJavaCrc32C
>> 2014-08-27 15:51:39,926 INFO  [main] compress.CodecPool: Got brand-new
>> compressor [.snappy]
>> 2014-08-27 15:51:39,930 INFO  [main] compress.CodecPool: Got brand-new
>> compressor [.snappy]
>> 2014-08-27 15:51:39,934 ERROR [main] hbase.KeyValue: Unexpected
>> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
>> 2014-08-27 15:51:40,185 INFO  [main] compress.CodecPool: Got brand-new
>> decompressor [.snappy]
>> SUCCESS
>> 
>> 
>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>> file:///tmp/snappy-test gz
>> 2014-08-27 15:57:18,633 INFO  [main] Configuration.deprecation:
>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>> SLF4J: Class path contains multiple SLF4J bindings.
>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>> 2014-08-27 15:57:18,969 INFO  [main] util.ChecksumType: Checksum using
>> org.apache.hadoop.util.PureJavaCrc32
>> 2014-08-27 15:57:18,970 INFO  [main] util.ChecksumType: Checksum can use
>> org.apache.hadoop.util.PureJavaCrc32C
>> 2014-08-27 15:57:19,127 INFO  [main] zlib.ZlibFactory: Successfully loaded
>> & initialized native-zlib library
>> 2014-08-27 15:57:19,146 INFO  [main] compress.CodecPool: Got brand-new
>> compressor [.gz]
>> 2014-08-27 15:57:19,149 INFO  [main] compress.CodecPool: Got brand-new
>> compressor [.gz]
>> 2014-08-27 15:57:19,153 ERROR [main] hbase.KeyValue: Unexpected
>> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
>> 2014-08-27 15:57:19,401 INFO  [main] compress.CodecPool: Got brand-new
>> decompressor [.gz]
>> SUCCESS
>> 
>> 
>> 2 questions:
>> 1) Is it OK that I get "SUCCESS" along with "ERROR [main] hbase.KeyValue:
>> Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey"?
>> 2) Is this extra setting of "JAVA_LIBRARY_PATH" a good way to set up
>> snappy with Hadoop 2.4.1 and HBase 0.98.4?
>> 
>> 
>> Regards
>> Arthur
>> 
>> 
>> 
>> On 27 Aug, 2014, at 1:13 pm, Arthur.hk.chan@gmail.com <
>> arthur.hk.chan@gmail.com> wrote:
>> 
>>> Hi,
>>> 
>>> Thanks!  tried but still same error:
>>> 
>>> rm hadoop-2.4.1-src -Rf
>> 
>>         // delete all old src files and try again
>>> tar -vxf hadoop-2.4.1-src.tar.gz
>>> cd hadoop-2.4.1-src
>>> mvn -DskipTests clean install -Drequire.snappy=true -Pnative
>>                                                      // compile with snappy
>>> [INFO]
>>> [INFO] Apache Hadoop Main ................................ SUCCESS
>> [0.887s]
>>> [INFO] Apache Hadoop Project POM ......................... SUCCESS
>> [0.306s]
>>> [INFO] Apache Hadoop Annotations ......................... SUCCESS
>> [0.859s]
>>> [INFO] Apache Hadoop Project Dist POM .................... SUCCESS
>> [0.231s]
>>> [INFO] Apache Hadoop Assemblies .......................... SUCCESS
>> [0.071s]
>>> [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
>> [0.960s]
>>> [INFO] Apache Hadoop MiniKDC ............................. SUCCESS
>> [0.711s]
>>> [INFO] Apache Hadoop Auth ................................ SUCCESS
>> [0.641s]
>>> [INFO] Apache Hadoop Auth Examples ....................... SUCCESS
>> [0.528s]
>>> [INFO] Apache Hadoop Common .............................. SUCCESS
>> [7.859s]
>>> [INFO] Apache Hadoop NFS ................................. SUCCESS
>> [0.282s]
>>> [INFO] Apache Hadoop Common Project ...................... SUCCESS
>> [0.013s]
>>> [INFO] Apache Hadoop HDFS ................................ SUCCESS
>> [14.210s]
>>> [INFO] Apache Hadoop HttpFS .............................. SUCCESS
>> [1.322s]
>>> [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
>> [0.418s]
>>> [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
>> [0.178s]
>>> [INFO] Apache Hadoop HDFS Project ........................ SUCCESS
>> [0.016s]
>>> [INFO] hadoop-yarn ....................................... SUCCESS
>> [0.014s]
>>> [INFO] hadoop-yarn-api ................................... SUCCESS
>> [3.012s]
>>> [INFO] hadoop-yarn-common ................................ SUCCESS
>> [1.173s]
>>> [INFO] hadoop-yarn-server ................................ SUCCESS
>> [0.029s]
>>> [INFO] hadoop-yarn-server-common ......................... SUCCESS
>> [0.379s]
>>> [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
>> [0.612s]
>>> [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
>> [0.166s]
>>> [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
>> [0.213s]
>>> [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
>> [0.970s]
>>> [INFO] hadoop-yarn-server-tests .......................... SUCCESS
>> [0.158s]
>>> [INFO] hadoop-yarn-client ................................ SUCCESS
>> [0.227s]
>>> [INFO] hadoop-yarn-applications .......................... SUCCESS
>> [0.013s]
>>> [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
>> [0.157s]
>>> [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
>> [0.094s]
>>> [INFO] hadoop-yarn-site .................................. SUCCESS
>> [0.024s]
>>> [INFO] hadoop-yarn-project ............................... SUCCESS
>> [0.030s]
>>> [INFO] hadoop-mapreduce-client ........................... SUCCESS
>> [0.027s]
>>> [INFO] hadoop-mapreduce-client-core ...................... SUCCESS
>> [1.206s]
>>> [INFO] hadoop-mapreduce-client-common .................... SUCCESS
>> [1.140s]
>>> [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
>> [0.128s]
>>> [INFO] hadoop-mapreduce-client-app ....................... SUCCESS
>> [0.634s]
>>> [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
>> [0.557s]
>>> [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
>> [0.882s]
>>> [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
>> [0.085s]
>>> [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
>> [0.224s]
>>> [INFO] hadoop-mapreduce .................................. SUCCESS
>> [0.030s]
>>> [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
>> [0.200s]
>>> [INFO] Apache Hadoop Distributed Copy .................... SUCCESS
>> [0.656s]
>>> [INFO] Apache Hadoop Archives ............................ SUCCESS
>> [0.112s]
>>> [INFO] Apache Hadoop Rumen ............................... SUCCESS
>> [0.246s]
>>> [INFO] Apache Hadoop Gridmix ............................. SUCCESS
>> [0.283s]
>>> [INFO] Apache Hadoop Data Join ........................... SUCCESS
>> [0.111s]
>>> [INFO] Apache Hadoop Extras .............................. SUCCESS
>> [0.146s]
>>> [INFO] Apache Hadoop Pipes ............................... SUCCESS
>> [0.011s]
>>> [INFO] Apache Hadoop OpenStack support ................... SUCCESS
>> [0.283s]
>>> [INFO] Apache Hadoop Client .............................. SUCCESS
>> [0.106s]
>>> [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
>> [0.038s]
>>> [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
>> [0.223s]
>>> [INFO] Apache Hadoop Tools Dist .......................... SUCCESS
>> [0.106s]
>>> [INFO] Apache Hadoop Tools ............................... SUCCESS
>> [0.010s]
>>> [INFO] Apache Hadoop Distribution ........................ SUCCESS
>> [0.034s]
>>> [INFO]
>> ------------------------------------------------------------------------
>>> [INFO] BUILD SUCCESS
>>> [INFO]
>> ------------------------------------------------------------------------
>>> [INFO] Total time: 45.478s
>>> [INFO] Finished at: Wed Aug 27 12:10:06 HKT 2014
>>> [INFO] Final Memory: 107M/1898M
>>> [INFO]
>> ------------------------------------------------------------------------
>>> mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy=true
>>                                                             // package it
>> with snappy
>>> [INFO]
>> ------------------------------------------------------------------------
>>> [INFO] Reactor Summary:
>>> [INFO]
>>> [INFO] Apache Hadoop Main ................................ SUCCESS
>> [0.727s]
>>> [INFO] Apache Hadoop Project POM ......................... SUCCESS
>> [0.555s]
>>> [INFO] Apache Hadoop Annotations ......................... SUCCESS
>> [1.011s]
>>> [INFO] Apache Hadoop Assemblies .......................... SUCCESS
>> [0.128s]
>>> [INFO] Apache Hadoop Project Dist POM .................... SUCCESS
>> [1.342s]
>>> [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
>> [1.251s]
>>> [INFO] Apache Hadoop MiniKDC ............................. SUCCESS
>> [1.007s]
>>> [INFO] Apache Hadoop Auth ................................ SUCCESS
>> [1.252s]
>>> [INFO] Apache Hadoop Auth Examples ....................... SUCCESS
>> [0.929s]
>>> [INFO] Apache Hadoop Common .............................. SUCCESS
>> [41.330s]
>>> [INFO] Apache Hadoop NFS ................................. SUCCESS
>> [1.986s]
>>> [INFO] Apache Hadoop Common Project ...................... SUCCESS
>> [0.015s]
>>> [INFO] Apache Hadoop HDFS ................................ SUCCESS
>> [1:08.367s]
>>> [INFO] Apache Hadoop HttpFS .............................. SUCCESS
>> [47.198s]
>>> [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
>> [2.807s]
>>> [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
>> [1.350s]
>>> [INFO] Apache Hadoop HDFS Project ........................ SUCCESS
>> [0.027s]
>>> [INFO] hadoop-yarn ....................................... SUCCESS
>> [0.013s]
>>> [INFO] hadoop-yarn-api ................................... SUCCESS
>> [36.848s]
>>> [INFO] hadoop-yarn-common ................................ SUCCESS
>> [12.502s]
>>> [INFO] hadoop-yarn-server ................................ SUCCESS
>> [0.032s]
>>> [INFO] hadoop-yarn-server-common ......................... SUCCESS
>> [3.688s]
>>> [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
>> [8.207s]
>>> [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
>> [1.048s]
>>> [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
>> [1.839s]
>>> [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
>> [4.766s]
>>> [INFO] hadoop-yarn-server-tests .......................... SUCCESS
>> [0.247s]
>>> [INFO] hadoop-yarn-client ................................ SUCCESS
>> [1.735s]
>>> [INFO] hadoop-yarn-applications .......................... SUCCESS
>> [0.013s]
>>> [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
>> [0.984s]
>>> [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
>> [0.792s]
>>> [INFO] hadoop-yarn-site .................................. SUCCESS
>> [0.034s]
>>> [INFO] hadoop-yarn-project ............................... SUCCESS
>> [3.327s]
>>> [INFO] hadoop-mapreduce-client ........................... SUCCESS
>> [0.090s]
>>> [INFO] hadoop-mapreduce-client-core ...................... SUCCESS
>> [7.451s]
>>> [INFO] hadoop-mapreduce-client-common .................... SUCCESS
>> [7.081s]
>>> [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
>> [0.972s]
>>> [INFO] hadoop-mapreduce-client-app ....................... SUCCESS
>> [3.085s]
>>> [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
>> [3.119s]
>>> [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
>> [1.934s]
>>> [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
>> [0.772s]
>>> [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
>> [2.162s]
>>> [INFO] hadoop-mapreduce .................................. SUCCESS
>> [2.622s]
>>> [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
>> [1.744s]
>>> [INFO] Apache Hadoop Distributed Copy .................... SUCCESS
>> [4.466s]
>>> [INFO] Apache Hadoop Archives ............................ SUCCESS
>> [0.956s]
>>> [INFO] Apache Hadoop Rumen ............................... SUCCESS
>> [2.203s]
>>> [INFO] Apache Hadoop Gridmix ............................. SUCCESS
>> [1.509s]
>>> [INFO] Apache Hadoop Data Join ........................... SUCCESS
>> [0.909s]
>>> [INFO] Apache Hadoop Extras .............................. SUCCESS
>> [1.103s]
>>> [INFO] Apache Hadoop Pipes ............................... SUCCESS
>> [4.794s]
>>> [INFO] Apache Hadoop OpenStack support ................... SUCCESS
>> [2.111s]
>>> [INFO] Apache Hadoop Client .............................. SUCCESS
>> [3.919s]
>>> [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
>> [0.044s]
>>> [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
>> [1.665s]
>>> [INFO] Apache Hadoop Tools Dist .......................... SUCCESS
>> [3.936s]
>>> [INFO] Apache Hadoop Tools ............................... SUCCESS
>> [0.042s]
>>> [INFO] Apache Hadoop Distribution ........................ SUCCESS
>> [15.208s]
>>> [INFO]
>> ------------------------------------------------------------------------
>>> [INFO] BUILD SUCCESS
>>> [INFO]
>> ------------------------------------------------------------------------
>>> [INFO] Total time: 5:22.529s
>>> [INFO] Finished at: Wed Aug 27 12:17:06 HKT 2014
>>> [INFO] Final Memory: 86M/755M
>>> [INFO]
>> ------------------------------------------------------------------------
>>> 
>>> ll
>> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/
>>> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:12 libhadoop.a
>>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 12:12 libhadoop.so ->
>> libhadoop.so.1.0.0
>>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:12 libhadoop.so.1.0.0
>>> 
>>> (copy them to $HADOOP_HOME/lib and $HBASE_HOME/lib)
>>> cp
>> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/*
>> $HADOOP_HOME/lib/native/Linux-amd64-64/
>>> cp
>> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/*
>> $HBASE_HOME/lib/native/Linux-amd64-64/
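[Editor's aside: after copying libhadoop.so around as above, you can check whether that particular build really contains the Snappy JNI code by looking for its exported symbols (e.g. `Java_org_apache_hadoop_io_compress_snappy_SnappyCompressor_*` in `nm -D` output). The `has_snappy_symbols` helper is our own; it takes the `nm` output as an argument so it stays testable.]

```shell
# Succeeds if the dynamic-symbol listing of a libhadoop.so mentions the
# Snappy compressor JNI entry points.
has_snappy_symbols() {
  echo "$1" | grep -q 'SnappyCompressor'
}

# Real use (path is an example):
#   has_snappy_symbols "$(nm -D $HBASE_HOME/lib/native/Linux-amd64-64/libhadoop.so)" \
#     && echo "snappy support built in"
```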
>>> 
>>> ll $HADOOP_HOME/lib/native/Linux-amd64-64/
>>> total 21236
>>> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a
>>                                                      // new
>>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so ->
>> libhadoopsnappy.so.0.0.1
>>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so.0 ->
>> libhadoopsnappy.so.0.0.1
>>> -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 06:54 libhadoopsnappy.so.0.0.1
>>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so
>>                                                     // new
>>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0
>>                                                     // new
>>> lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 06:54 libjvm.so ->
>> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
>>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so ->
>> libprotobuf-lite.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so.8
>> -> libprotobuf-lite.so.8.0.0
>>> -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 06:54
>> libprotobuf-lite.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so ->
>> libprotobuf.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so.8 ->
>> libprotobuf.so.8.0.0
>>> -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 06:54 libprotobuf.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so ->
>> libprotoc.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so.8 ->
>> libprotoc.so.8.0.0
>>> -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 06:54 libprotoc.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:31 libsnappy.so ->
>> /usr/lib64/libsnappy.so
>>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so.1 ->
>> /usr/lib64/libsnappy.so
>>> -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 06:54 libsnappy.so.1.2.0
>>> drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 11:15 pkgconfig
>>> 
>>> 
>>> ll $HBASE_HOME/lib/native/Linux-amd64-64/
>>> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a
>>                                                      // new
>>> -rw-rw-r--. 1 hduser hadoop 1487564 Aug 27 11:14 libhadooppipes.a
>>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so ->
>> libhadoopsnappy.so.0.0.1
>>> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 ->
>> libhadoopsnappy.so.0.0.1
>>> -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
>>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so
>>                                                     // new
>>> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0
>>                                                     // new
>>> -rw-rw-r--. 1 hduser hadoop  582472 Aug 27 11:14 libhadooputils.a
>>> -rw-rw-r--. 1 hduser hadoop  298626 Aug 27 11:14 libhdfs.a
>>> -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so
>>> -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so.0.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 07:08 libjvm.so ->
>> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
>>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so ->
>> libprotobuf-lite.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8
>> -> libprotobuf-lite.so.8.0.0
>>> -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 07:08
>> libprotobuf-lite.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so ->
>> libprotobuf.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
>> libprotobuf.so.8.0.0
>>> -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so ->
>> libprotoc.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
>> libprotoc.so.8.0.0
>>> -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
>>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so ->
>> /usr/lib64/libsnappy.so
>>> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:33 libsnappy.so.1 ->
>> /usr/lib64/libsnappy.so
>>> -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
>>> drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 07:08 pkgconfig
>>> 
>>> 
>>> 
>>> sudo yum install snappy snappy-devel
>>> Loaded plugins: fastestmirror, security
>>> Loading mirror speeds from cached hostfile
>>> ...
>>> Package snappy-1.1.0-1.el6.x86_64 already installed and latest version
>>> Package snappy-devel-1.1.0-1.el6.x86_64 already installed and latest
>> version
>>> Nothing to do
>>> 
>>> 
>>> ln -sf /usr/lib64/libsnappy.so $HADOOP_HOME/lib/native/Linux-amd64-64/.
>>> ln -sf /usr/lib64/libsnappy.so $HBASE_HOME/lib/native/Linux-amd64-64/.
>>> 
>>> ll $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so
>>> lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:31
>> $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so ->
>> /usr/lib64/libsnappy.s
>>> ll $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so
>>> lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:32
>> $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so ->
>> /usr/lib64/libsnappy.so
>>> 
>>> 
>>> 
>>> ($HADOOP_HOME/etc/hadoop/hadoop-env.sh  added following)
>>> ### 2014-08-27
>>> export
>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
>>> ###
>>> 
>>> ($HBASE_HOME/conf/hbase-env.sh added following)
>>> ### 2014-08-27
>>> export
>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
>>> export
>> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
>>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
>>> ###
>>> 
>>> 
>>> (restarted both HADOOP and HBASE)
>>> jps
>>> 26324 HRegionServer
>>> 26137 HMaster
>>> 25567 JobHistoryServer
>>> 25485 NodeManager
>>> 25913 WebAppProxyServer
>>> 24831 DataNode
>>> 24712 NameNode
>>> 27146 Jps
>>> 9219 QuorumPeerMain
>>> 25042 JournalNode
>>> 25239 DFSZKFailoverController
>>> 25358 ResourceManager
>>> 
>>> 
>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>> file:///tmp/snappy-test snappy
>>> 2014-08-27 12:24:08,030 INFO  [main] Configuration.deprecation:
>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>>> 2014-08-27 12:24:08,387 INFO  [main] util.ChecksumType: Checksum using
>> org.apache.hadoop.util.PureJavaCrc32
>>> 2014-08-27 12:24:08,388 INFO  [main] util.ChecksumType: Checksum can use
>> org.apache.hadoop.util.PureJavaCrc32C
>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>> library not available: this version of libhadoop was built without snappy
>> support.
>>>      at
>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>      at
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>      at
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>      at
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>      at
>> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>      at
>> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>      at
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>      at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>      at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>      at
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>      at
>> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>      at
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>      at
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>> 
>>> 
>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>> file:///tmp/snappy-test gz
>>> 2014-08-27 12:35:34,485 INFO  [main] Configuration.deprecation:
>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>> explanation.
>>> 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum using
>> org.apache.hadoop.util.PureJavaCrc32
>>> 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum can use
>> org.apache.hadoop.util.PureJavaCrc32C
>>> 2014-08-27 12:35:35,822 INFO  [main] zlib.ZlibFactory: Successfully
>> loaded & initialized native-zlib library
>>> 2014-08-27 12:35:35,851 INFO  [main] compress.CodecPool: Got brand-new
>> compressor [.gz]
>>> 2014-08-27 12:35:35,855 INFO  [main] compress.CodecPool: Got brand-new
>> compressor [.gz]
>>> 2014-08-27 12:35:35,866 ERROR [main] hbase.KeyValue: Unexpected
>> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
>>> 2014-08-27 12:35:36,636 INFO  [main] compress.CodecPool: Got brand-new
>> decompressor [.gz]
>>> SUCCESS
>>> 
>>> 
>>> 
>>> 
>>> 
>>> So I still get the same issue. I feel the issue comes from the
>> Hadoop compilation, but I have no idea what would be wrong. Please help.
>>> 
>>> 
>>> in my /etc/hadoop/core-site.xml, I have following related to snappy:
>>>   <property>
>>>    <name>io.compression.codecs</name>
>>>    <value>
>>>      org.apache.hadoop.io.compress.GzipCodec,
>>>      org.apache.hadoop.io.compress.DefaultCodec,
>>>      org.apache.hadoop.io.compress.BZip2Codec,
>>>      org.apache.hadoop.io.compress.SnappyCodec
>>>    </value>
>>>   </property>
>>> 
>>> in my mapred-site.xml, I have following related to snappy:
>>>   <property>
>>>    <name>mapred.output.compress</name>
>>>    <value>false</value>
>>>    <description>Should the job outputs be compressed?</description>
>>>   </property>
>>>   <property>
>>>    <name>mapred.output.compression.type</name>
>>>    <value>RECORD</value>
>>>    <description>If the job outputs are to be compressed as SequenceFiles,
>> how should they be compressed? Should be one of NONE, RECORD or
>> BLOCK.</description>
>>>   </property>
>>>   <property>
>>>    <name>mapred.output.compression.codec</name>
>>>    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
>>>    <description>If the job outputs are compressed, how should they be
>> compressed?
>>>    </description>
>>>   </property>
>>>   <property>
>>>    <name>mapred.compress.map.output</name>
>>>    <value>true</value>
>>>    <description>Should the outputs of the maps be compressed before
>> being sent across the network. Uses SequenceFile compression.</description>
>>>   </property>
>>>   <property>
>>>    <name>mapred.map.output.compression.codec</name>
>>>    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
>>>    <description>If the map outputs are compressed, how should they be
>> compressed?</description>
>>>  </property>
>>> 
>>>  <property>
>>>   <name>mapreduce.map.output.compress</name>
>>>   <value>true</value>
>>>  </property>
>>>  <property>
>>>   <name>mapred.map.output.compress.codec</name>
>>>   <value>org.apache.hadoop.io.compress.SnappyCodec</value>
>>>  </property>
>>> 
>>> 
>>> I didn’t add any snappy-related property to hbase-site.xml
>>> 
>>> 
>>> 
>>> Regards
>>> Arthur
>>> 
>>> 
>>> 
>>> 
>>> On 27 Aug, 2014, at 8:07 am, Andrew Purtell <ap...@apache.org> wrote:
>>> 
>>>> On Tue, Aug 26, 2014 at 4:25 PM, Arthur.hk.chan@gmail.com <
>>>> arthur.hk.chan@gmail.com> wrote:
>>>> 
>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>>>>> library not available: this version of libhadoop was built without
>> snappy
>>>>> support.
>>>> 
>>>> You are almost there. Unfortunately the native Hadoop libraries you
>> copied
>>>> into HBase's lib/native/Linux-amd64-64/ directory were apparently
>>>> built without snappy support, as the exception indicates. You'll need to
>>>> compile the native Hadoop libraries with snappy support enabled. Install
>>>> snappy-devel as Alex mentioned and then build the Hadoop native
>> libraries.
>>>> 
>>>> 1. Get Hadoop sources for the Hadoop version
>>>> 2. tar xvzf ....
>>>> 3. cd /path/to/hadoop/src
>>>> 4. mvn -DskipTests clean install -Drequire.snappy=true -Pnative
>>>> 5. cp
>>>> 
>> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.*
>>>> /path/to/hbase/lib/native/Linux-amd64-64
>>>> 
>>>> (The -Drequire.snappy=true will fail the build if Snappy link libraries
>>>> are not installed, so you can be sure of this.)
>>>> 
>>>> 
>>>> --
>>>> Best regards,
>>>> 
>>>>  - Andy
>>>> 
>>>> Problems worthy of attack prove their worth by hitting back. - Piet Hein
>>>> (via Tom White)
>>> 
>> 
>> 

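The build steps quoted above can be sanity-checked before copying libraries around: a libhadoop.so built with -Drequire.snappy=true should reference Snappy internally, while one built without will not. The helper below is a sketch of that check (the function name is mine, not a Hadoop tool; the path in the example is illustrative):

```shell
# Sketch: check whether a native library references Snappy.
# grep -a treats the binary as text, so no extra tools are needed.
check_snappy_support() {
  # $1 = path to the shared object (e.g. .../libhadoop.so.1.0.0)
  if grep -aq 'snappy' "$1"; then
    echo "snappy: yes"
  else
    echo "snappy: no"
  fi
}

# Example (adjust the path to your build tree):
# check_snappy_support \
#   hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.so.1.0.0
```

If this prints "snappy: no" for a freshly built libhadoop.so, the snappy-devel headers were most likely missing at build time.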

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Hi Arthur,

Glad to hear you got it!

Regarding #2, was JAVA_LIBRARY_PATH already set before? If so, that might
have been the issue. HBase will append to this path everything it needs (if
required), so I don't think there is anything else you need to add.

Regarding #1, I don't think it's an error. It might be more of a warning.
I will look at it to see where it comes from...
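For anyone following along, a quick way to see how a JAVA_LIBRARY_PATH-style value will resolve is to scan it the way the JVM does: walk the colon-separated entries in order and take the first directory that contains the library. This is an illustrative sketch only; the helper name is mine, not an HBase utility:

```shell
# Sketch: scan a colon-separated library path (as in JAVA_LIBRARY_PATH /
# java.library.path) and print the first directory containing libsnappy.so.
find_snappy_in_path() {
  _path=$1
  _old_ifs=$IFS; IFS=':'
  set -- $_path           # split the path on ':' into positional parameters
  IFS=$_old_ifs
  for dir in "$@"; do
    if [ -e "$dir/libsnappy.so" ]; then
      echo "$dir"
      return 0
    fi
  done
  echo "libsnappy.so not found" >&2
  return 1
}

# Example (path is illustrative):
# find_snappy_in_path "$HBASE_HOME/lib/native/Linux-amd64-64:/usr/lib64"
```

If the function prints nothing for the path you export, the JVM will not find the Snappy native library either, regardless of what else is on the classpath.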

JM


2014-08-27 4:00 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>:

> Hi,
>
> Many thanks for your advices!
>
> Finally, I managed to make it work.
>
> I needed to add:
> export JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"
>
> then run:
> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test snappy
> 2014-08-27 15:51:39,459 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> 2014-08-27 15:51:39,785 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> 2014-08-27 15:51:39,786 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-27 15:51:39,926 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.snappy]
> 2014-08-27 15:51:39,930 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.snappy]
> 2014-08-27 15:51:39,934 ERROR [main] hbase.KeyValue: Unexpected
> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> 2014-08-27 15:51:40,185 INFO  [main] compress.CodecPool: Got brand-new
> decompressor [.snappy]
> SUCCESS
>
>
> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test gz
> 2014-08-27 15:57:18,633 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> 2014-08-27 15:57:18,969 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> 2014-08-27 15:57:18,970 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-27 15:57:19,127 INFO  [main] zlib.ZlibFactory: Successfully loaded
> & initialized native-zlib library
> 2014-08-27 15:57:19,146 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.gz]
> 2014-08-27 15:57:19,149 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.gz]
> 2014-08-27 15:57:19,153 ERROR [main] hbase.KeyValue: Unexpected
> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> 2014-08-27 15:57:19,401 INFO  [main] compress.CodecPool: Got brand-new
> decompressor [.gz]
> SUCCESS
>
>
> 2 questions:
> 1) Is it OK to get "SUCCESS" together with "ERROR [main] hbase.KeyValue:
> Unexpected getShortMidpointKey result, fakeKey:testkey,
> firstKeyInBlock:testkey"?
> 2) Is this extra setting of "JAVA_LIBRARY_PATH" a good way to set up
> snappy with Hadoop 2.4.1 and HBase 0.98.4?
>
>
> Regards
> Arthur
>
>
>
> On 27 Aug, 2014, at 1:13 pm, Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com> wrote:
>
> > Hi,
> >
> > Thanks!  tried but still same error:
> >
> > rm hadoop-2.4.1-src -Rf
>
>          // delete all old src files and try again
> > tar -vxf hadoop-2.4.1-src.tar.gz
> > cd hadoop-2.4.1-src
> > mvn -DskipTests clean install -Drequire.snappy=true -Pnative
>                                                       // compile with snappy
> > [INFO]
> > [INFO] Apache Hadoop Main ................................ SUCCESS
> [0.887s]
> > [INFO] Apache Hadoop Project POM ......................... SUCCESS
> [0.306s]
> > [INFO] Apache Hadoop Annotations ......................... SUCCESS
> [0.859s]
> > [INFO] Apache Hadoop Project Dist POM .................... SUCCESS
> [0.231s]
> > [INFO] Apache Hadoop Assemblies .......................... SUCCESS
> [0.071s]
> > [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
> [0.960s]
> > [INFO] Apache Hadoop MiniKDC ............................. SUCCESS
> [0.711s]
> > [INFO] Apache Hadoop Auth ................................ SUCCESS
> [0.641s]
> > [INFO] Apache Hadoop Auth Examples ....................... SUCCESS
> [0.528s]
> > [INFO] Apache Hadoop Common .............................. SUCCESS
> [7.859s]
> > [INFO] Apache Hadoop NFS ................................. SUCCESS
> [0.282s]
> > [INFO] Apache Hadoop Common Project ...................... SUCCESS
> [0.013s]
> > [INFO] Apache Hadoop HDFS ................................ SUCCESS
> [14.210s]
> > [INFO] Apache Hadoop HttpFS .............................. SUCCESS
> [1.322s]
> > [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
> [0.418s]
> > [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
> [0.178s]
> > [INFO] Apache Hadoop HDFS Project ........................ SUCCESS
> [0.016s]
> > [INFO] hadoop-yarn ....................................... SUCCESS
> [0.014s]
> > [INFO] hadoop-yarn-api ................................... SUCCESS
> [3.012s]
> > [INFO] hadoop-yarn-common ................................ SUCCESS
> [1.173s]
> > [INFO] hadoop-yarn-server ................................ SUCCESS
> [0.029s]
> > [INFO] hadoop-yarn-server-common ......................... SUCCESS
> [0.379s]
> > [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
> [0.612s]
> > [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
> [0.166s]
> > [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
> [0.213s]
> > [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
> [0.970s]
> > [INFO] hadoop-yarn-server-tests .......................... SUCCESS
> [0.158s]
> > [INFO] hadoop-yarn-client ................................ SUCCESS
> [0.227s]
> > [INFO] hadoop-yarn-applications .......................... SUCCESS
> [0.013s]
> > [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
> [0.157s]
> > [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
> [0.094s]
> > [INFO] hadoop-yarn-site .................................. SUCCESS
> [0.024s]
> > [INFO] hadoop-yarn-project ............................... SUCCESS
> [0.030s]
> > [INFO] hadoop-mapreduce-client ........................... SUCCESS
> [0.027s]
> > [INFO] hadoop-mapreduce-client-core ...................... SUCCESS
> [1.206s]
> > [INFO] hadoop-mapreduce-client-common .................... SUCCESS
> [1.140s]
> > [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
> [0.128s]
> > [INFO] hadoop-mapreduce-client-app ....................... SUCCESS
> [0.634s]
> > [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
> [0.557s]
> > [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
> [0.882s]
> > [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
> [0.085s]
> > [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
> [0.224s]
> > [INFO] hadoop-mapreduce .................................. SUCCESS
> [0.030s]
> > [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
> [0.200s]
> > [INFO] Apache Hadoop Distributed Copy .................... SUCCESS
> [0.656s]
> > [INFO] Apache Hadoop Archives ............................ SUCCESS
> [0.112s]
> > [INFO] Apache Hadoop Rumen ............................... SUCCESS
> [0.246s]
> > [INFO] Apache Hadoop Gridmix ............................. SUCCESS
> [0.283s]
> > [INFO] Apache Hadoop Data Join ........................... SUCCESS
> [0.111s]
> > [INFO] Apache Hadoop Extras .............................. SUCCESS
> [0.146s]
> > [INFO] Apache Hadoop Pipes ............................... SUCCESS
> [0.011s]
> > [INFO] Apache Hadoop OpenStack support ................... SUCCESS
> [0.283s]
> > [INFO] Apache Hadoop Client .............................. SUCCESS
> [0.106s]
> > [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
> [0.038s]
> > [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
> [0.223s]
> > [INFO] Apache Hadoop Tools Dist .......................... SUCCESS
> [0.106s]
> > [INFO] Apache Hadoop Tools ............................... SUCCESS
> [0.010s]
> > [INFO] Apache Hadoop Distribution ........................ SUCCESS
> [0.034s]
> > [INFO]
> ------------------------------------------------------------------------
> > [INFO] BUILD SUCCESS
> > [INFO]
> ------------------------------------------------------------------------
> > [INFO] Total time: 45.478s
> > [INFO] Finished at: Wed Aug 27 12:10:06 HKT 2014
> > [INFO] Final Memory: 107M/1898M
> > [INFO]
> ------------------------------------------------------------------------
> > mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy=true
>                                                              // package it
> with snappy
> > [INFO]
> ------------------------------------------------------------------------
> > [INFO] Reactor Summary:
> > [INFO]
> > [INFO] Apache Hadoop Main ................................ SUCCESS
> [0.727s]
> > [INFO] Apache Hadoop Project POM ......................... SUCCESS
> [0.555s]
> > [INFO] Apache Hadoop Annotations ......................... SUCCESS
> [1.011s]
> > [INFO] Apache Hadoop Assemblies .......................... SUCCESS
> [0.128s]
> > [INFO] Apache Hadoop Project Dist POM .................... SUCCESS
> [1.342s]
> > [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS
> [1.251s]
> > [INFO] Apache Hadoop MiniKDC ............................. SUCCESS
> [1.007s]
> > [INFO] Apache Hadoop Auth ................................ SUCCESS
> [1.252s]
> > [INFO] Apache Hadoop Auth Examples ....................... SUCCESS
> [0.929s]
> > [INFO] Apache Hadoop Common .............................. SUCCESS
> [41.330s]
> > [INFO] Apache Hadoop NFS ................................. SUCCESS
> [1.986s]
> > [INFO] Apache Hadoop Common Project ...................... SUCCESS
> [0.015s]
> > [INFO] Apache Hadoop HDFS ................................ SUCCESS
> [1:08.367s]
> > [INFO] Apache Hadoop HttpFS .............................. SUCCESS
> [47.198s]
> > [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS
> [2.807s]
> > [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS
> [1.350s]
> > [INFO] Apache Hadoop HDFS Project ........................ SUCCESS
> [0.027s]
> > [INFO] hadoop-yarn ....................................... SUCCESS
> [0.013s]
> > [INFO] hadoop-yarn-api ................................... SUCCESS
> [36.848s]
> > [INFO] hadoop-yarn-common ................................ SUCCESS
> [12.502s]
> > [INFO] hadoop-yarn-server ................................ SUCCESS
> [0.032s]
> > [INFO] hadoop-yarn-server-common ......................... SUCCESS
> [3.688s]
> > [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS
> [8.207s]
> > [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS
> [1.048s]
> > [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS
> [1.839s]
> > [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS
> [4.766s]
> > [INFO] hadoop-yarn-server-tests .......................... SUCCESS
> [0.247s]
> > [INFO] hadoop-yarn-client ................................ SUCCESS
> [1.735s]
> > [INFO] hadoop-yarn-applications .......................... SUCCESS
> [0.013s]
> > [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS
> [0.984s]
> > [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS
> [0.792s]
> > [INFO] hadoop-yarn-site .................................. SUCCESS
> [0.034s]
> > [INFO] hadoop-yarn-project ............................... SUCCESS
> [3.327s]
> > [INFO] hadoop-mapreduce-client ........................... SUCCESS
> [0.090s]
> > [INFO] hadoop-mapreduce-client-core ...................... SUCCESS
> [7.451s]
> > [INFO] hadoop-mapreduce-client-common .................... SUCCESS
> [7.081s]
> > [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS
> [0.972s]
> > [INFO] hadoop-mapreduce-client-app ....................... SUCCESS
> [3.085s]
> > [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS
> [3.119s]
> > [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS
> [1.934s]
> > [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS
> [0.772s]
> > [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS
> [2.162s]
> > [INFO] hadoop-mapreduce .................................. SUCCESS
> [2.622s]
> > [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS
> [1.744s]
> > [INFO] Apache Hadoop Distributed Copy .................... SUCCESS
> [4.466s]
> > [INFO] Apache Hadoop Archives ............................ SUCCESS
> [0.956s]
> > [INFO] Apache Hadoop Rumen ............................... SUCCESS
> [2.203s]
> > [INFO] Apache Hadoop Gridmix ............................. SUCCESS
> [1.509s]
> > [INFO] Apache Hadoop Data Join ........................... SUCCESS
> [0.909s]
> > [INFO] Apache Hadoop Extras .............................. SUCCESS
> [1.103s]
> > [INFO] Apache Hadoop Pipes ............................... SUCCESS
> [4.794s]
> > [INFO] Apache Hadoop OpenStack support ................... SUCCESS
> [2.111s]
> > [INFO] Apache Hadoop Client .............................. SUCCESS
> [3.919s]
> > [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS
> [0.044s]
> > [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS
> [1.665s]
> > [INFO] Apache Hadoop Tools Dist .......................... SUCCESS
> [3.936s]
> > [INFO] Apache Hadoop Tools ............................... SUCCESS
> [0.042s]
> > [INFO] Apache Hadoop Distribution ........................ SUCCESS
> [15.208s]
> > [INFO]
> ------------------------------------------------------------------------
> > [INFO] BUILD SUCCESS
> > [INFO]
> ------------------------------------------------------------------------
> > [INFO] Total time: 5:22.529s
> > [INFO] Finished at: Wed Aug 27 12:17:06 HKT 2014
> > [INFO] Final Memory: 86M/755M
> > [INFO]
> ------------------------------------------------------------------------
> >
> > ll
> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/
> > -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:12 libhadoop.a
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 12:12 libhadoop.so ->
> libhadoop.so.1.0.0
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:12 libhadoop.so.1.0.0
> >
> > (copy them to $HADOOP_HOME/lib and $HBASE_HOME/lib)
> > cp
> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/*
> $HADOOP_HOME/lib/native/Linux-amd64-64/
> > cp
> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/*
> $HBASE_HOME/lib/native/Linux-amd64-64/
> >
> > ll $HADOOP_HOME/lib/native/Linux-amd64-64/
> > total 21236
> > -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a
>                                                       // new
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so ->
> libhadoopsnappy.so.0.0.1
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so.0 ->
> libhadoopsnappy.so.0.0.1
> > -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 06:54 libhadoopsnappy.so.0.0.1
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so
>                                                      // new
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0
>                                                      // new
> > lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 06:54 libjvm.so ->
> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so ->
> libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so.8
> -> libprotobuf-lite.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 06:54
> libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so ->
> libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so.8 ->
> libprotobuf.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 06:54 libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so ->
> libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so.8 ->
> libprotoc.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 06:54 libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:31 libsnappy.so ->
> /usr/lib64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so.1 ->
> /usr/lib64/libsnappy.so
> > -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 06:54 libsnappy.so.1.2.0
> > drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 11:15 pkgconfig
> >
> >
> > ll $HBASE_HOME/lib/native/Linux-amd64-64/
> > -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a
>                                                       // new
> > -rw-rw-r--. 1 hduser hadoop 1487564 Aug 27 11:14 libhadooppipes.a
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so ->
> libhadoopsnappy.so.0.0.1
> > lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 ->
> libhadoopsnappy.so.0.0.1
> > -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so
>                                                      // new
> > -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0
>                                                      // new
> > -rw-rw-r--. 1 hduser hadoop  582472 Aug 27 11:14 libhadooputils.a
> > -rw-rw-r--. 1 hduser hadoop  298626 Aug 27 11:14 libhdfs.a
> > -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so
> > -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so.0.0.0
> > lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 07:08 libjvm.so ->
> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so ->
> libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8
> -> libprotobuf-lite.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 07:08
> libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so ->
> libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
> libprotobuf.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so ->
> libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
> libprotoc.so.8.0.0
> > -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so ->
> /usr/lib64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:33 libsnappy.so.1 ->
> /usr/lib64/libsnappy.so
> > -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> > drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 07:08 pkgconfig
> >
> >
> >
> > sudo yum install snappy snappy-devel
> > Loaded plugins: fastestmirror, security
> > Loading mirror speeds from cached hostfile
> >  ...
> > Package snappy-1.1.0-1.el6.x86_64 already installed and latest version
> > Package snappy-devel-1.1.0-1.el6.x86_64 already installed and latest
> version
> > Nothing to do
> >
> >
> > ln -sf /usr/lib64/libsnappy.so $HADOOP_HOME/lib/native/Linux-amd64-64/.
> > ln -sf /usr/lib64/libsnappy.so $HBASE_HOME/lib/native/Linux-amd64-64/.
> >
> > ll $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:31
> $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so ->
> /usr/lib64/libsnappy.so
> > ll $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so
> > lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:32
> $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so ->
> /usr/lib64/libsnappy.so
> >
> >
> >
> > ($HADOOP_HOME/etc/hadoop/hadoop-env.sh  added following)
> > ### 2014-08-27
> > export
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> > ###
> >
> > ($HBASE_HOME/conf/hbase-env.sh added following)
> > ### 2014-08-27
> > export
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> > export
> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> > export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> > export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> > ###
> >
> >
> > (restarted both HADOOP and HBASE)
> > jps
> > 26324 HRegionServer
> > 26137 HMaster
> > 25567 JobHistoryServer
> > 25485 NodeManager
> > 25913 WebAppProxyServer
> > 24831 DataNode
> > 24712 NameNode
> > 27146 Jps
> > 9219 QuorumPeerMain
> > 25042 JournalNode
> > 25239 DFSZKFailoverController
> > 25358 ResourceManager
> >
> >
> > bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test snappy
> > 2014-08-27 12:24:08,030 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> > 2014-08-27 12:24:08,387 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-27 12:24:08,388 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.RuntimeException: native snappy
> library not available: this version of libhadoop was built without snappy
> support.
> >       at
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >       at
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >       at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >       at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >       at
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >       at
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >       at
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >       at
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >
> >
> > bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test gz
> > 2014-08-27 12:35:34,485 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> > 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> > 2014-08-27 12:35:35,822 INFO  [main] zlib.ZlibFactory: Successfully
> loaded & initialized native-zlib library
> > 2014-08-27 12:35:35,851 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.gz]
> > 2014-08-27 12:35:35,855 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.gz]
> > 2014-08-27 12:35:35,866 ERROR [main] hbase.KeyValue: Unexpected
> getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> > 2014-08-27 12:35:36,636 INFO  [main] compress.CodecPool: Got brand-new
> decompressor [.gz]
> > SUCCESS
> >
> >
> >
> >
> >
> > So I still get the same issue. I feel the issue comes from the Hadoop
> compilation, but I have no idea what is wrong. Please help.
> >
> >
> > in my /etc/hadoop/core-site.xml, I have following related to snappy:
> >    <property>
> >     <name>io.compression.codecs</name>
> >     <value>
> >       org.apache.hadoop.io.compress.GzipCodec,
> >       org.apache.hadoop.io.compress.DefaultCodec,
> >       org.apache.hadoop.io.compress.BZip2Codec,
> >       org.apache.hadoop.io.compress.SnappyCodec
> >     </value>
> >    </property>
> >
> > in my mapred-site.xml, I have following related to snappy:
> >    <property>
> >     <name>mapred.output.compress</name>
> >     <value>false</value>
> >     <description>Should the job outputs be compressed?</description>
> >    </property>
> >    <property>
> >     <name>mapred.output.compression.type</name>
> >     <value>RECORD</value>
> >     <description>If the job outputs are to be compressed as SequenceFiles,
> how should they be compressed? Should be one of NONE, RECORD or
> BLOCK.</description>
> >    </property>
> >    <property>
> >     <name>mapred.output.compression.codec</name>
> >     <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >     <description>If the job outputs are compressed, how should they be
> compressed?
> >     </description>
> >    </property>
> >    <property>
> >     <name>mapred.compress.map.output</name>
> >     <value>true</value>
> >     <description>Should the outputs of the maps be compressed before
> being sent across the network. Uses SequenceFile compression.</description>
> >    </property>
> >    <property>
> >     <name>mapred.map.output.compression.codec</name>
> >     <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >     <description>If the map outputs are compressed, how should they be
> compressed?</description>
> >   </property>
> >
> >   <property>
> >    <name>mapreduce.map.output.compress</name>
> >    <value>true</value>
> >   </property>
> >   <property>
> >    <name>mapred.map.output.compress.codec</name>
> >    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
> >   </property>
> >
> >
> > I didn’t add any snappy-related property to hbase-site.xml
> >
> >
> >
> > Regards
> > Arthur
> >
> >
> >
> >
> > On 27 Aug, 2014, at 8:07 am, Andrew Purtell <ap...@apache.org> wrote:
> >
> >> On Tue, Aug 26, 2014 at 4:25 PM, Arthur.hk.chan@gmail.com <
> >> arthur.hk.chan@gmail.com> wrote:
> >>
> >>> Exception in thread "main" java.lang.RuntimeException: native snappy
> >>> library not available: this version of libhadoop was built without
> snappy
> >>> support.
> >>
> >> You are almost there. Unfortunately the native Hadoop libraries you
> copied
> >> into HBase's lib/native/Linux-amd64-64/ directory were apparently
> >> built without snappy support, as the exception indicates. You'll need to
> >> compile the native Hadoop libraries with snappy support enabled. Install
> >> snappy-devel as Alex mentioned and then build the Hadoop native
> libraries.
> >>
> >> 1. Get Hadoop sources for the Hadoop version
> >> 2. tar xvzf ....
> >> 3. cd /path/to/hadoop/src
> >> 4. mvn -DskipTests clean install -Drequire.snappy=true -Pnative
> >> 5. cp
> >>
> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.*
> >> /path/to/hbase/lib/native/Linux-amd64-64
> >>
> >> (The -Drequire.snappy=true will fail the build if Snappy link libraries
> >> are not installed, so you can be sure of this.)
> >>
> >>
> >> --
> >> Best regards,
> >>
> >>   - Andy
> >>
> >> Problems worthy of attack prove their worth by hitting back. - Piet Hein
> >> (via Tom White)
> >
>
>
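The five quoted steps above can be consolidated into a single sketch. The two directories are the placeholders from the message (not real paths), so the commands are printed as a dry run rather than executed; note the space between `-Drequire.snappy=true` and `-Pnative`:

```shell
#!/bin/sh
# Dry-run sketch of the rebuild-and-copy recipe above. Both paths are
# placeholders carried over from the original message, so the commands
# are echoed rather than run.
HADOOP_SRC="/path/to/hadoop/src"
HBASE_HOME="/path/to/hbase"

echo "cd $HADOOP_SRC"
# -Drequire.snappy=true aborts the build early if the snappy development
# headers are missing, which is exactly the failure mode being debugged.
echo "mvn -DskipTests clean install -Drequire.snappy=true -Pnative"
echo "cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.* $HBASE_HOME/lib/native/Linux-amd64-64"
```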

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi,

Many thanks for your advice!

Finally, I managed to make it work.

I needed to add:
export JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"

then run:
bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
2014-08-27 15:51:39,459 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-27 15:51:39,785 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-27 15:51:39,786 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
2014-08-27 15:51:39,926 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
2014-08-27 15:51:39,930 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
2014-08-27 15:51:39,934 ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
2014-08-27 15:51:40,185 INFO  [main] compress.CodecPool: Got brand-new decompressor [.snappy]
SUCCESS


bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
2014-08-27 15:57:18,633 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-27 15:57:18,969 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-27 15:57:18,970 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
2014-08-27 15:57:19,127 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2014-08-27 15:57:19,146 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
2014-08-27 15:57:19,149 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
2014-08-27 15:57:19,153 ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
2014-08-27 15:57:19,401 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
SUCCESS


2 questions: 
1) Is it OK that the test prints "SUCCESS" together with "ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey"?
2) Is this extra setting of "JAVA_LIBRARY_PATH" a good way to set up snappy with Hadoop 2.4.1 and HBase 0.98.4?
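For reference, the working sequence above boils down to the following sketch. The HBASE_HOME value is taken from the jar paths in the log output and is illustrative; the final smoke-test command is printed, not run:

```shell
#!/bin/sh
# Illustrative install location, taken from the classpath entries in the
# log output above -- substitute your own.
HBASE_HOME="/mnt/hadoop/hbase-0.98.4-hadoop2"
# Point the JVM at the directory holding the snappy-enabled libhadoop.so:
JAVA_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64"
export JAVA_LIBRARY_PATH
# The smoke test that should now report SUCCESS:
echo "bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy"
```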


Regards
Arthur



On 27 Aug, 2014, at 1:13 pm, Arthur.hk.chan@gmail.com <ar...@gmail.com> wrote:

> Hi,
> 
> Thanks! I tried it, but still get the same error:
> 
> rm hadoop-2.4.1-src -Rf																	// delete all old src files and try again
> tar -vxf hadoop-2.4.1-src.tar.gz					
> cd hadoop-2.4.1-src
> mvn -DskipTests clean install -Drequire.snappy=true -Pnative							// compile with snappy
> [INFO] 
> [INFO] Apache Hadoop Main ................................ SUCCESS [0.887s]
> [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.306s]
> [INFO] Apache Hadoop Annotations ......................... SUCCESS [0.859s]
> [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.231s]
> [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.071s]
> [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [0.960s]
> [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [0.711s]
> [INFO] Apache Hadoop Auth ................................ SUCCESS [0.641s]
> [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.528s]
> [INFO] Apache Hadoop Common .............................. SUCCESS [7.859s]
> [INFO] Apache Hadoop NFS ................................. SUCCESS [0.282s]
> [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.013s]
> [INFO] Apache Hadoop HDFS ................................ SUCCESS [14.210s]
> [INFO] Apache Hadoop HttpFS .............................. SUCCESS [1.322s]
> [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [0.418s]
> [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [0.178s]
> [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.016s]
> [INFO] hadoop-yarn ....................................... SUCCESS [0.014s]
> [INFO] hadoop-yarn-api ................................... SUCCESS [3.012s]
> [INFO] hadoop-yarn-common ................................ SUCCESS [1.173s]
> [INFO] hadoop-yarn-server ................................ SUCCESS [0.029s]
> [INFO] hadoop-yarn-server-common ......................... SUCCESS [0.379s]
> [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [0.612s]
> [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [0.166s]
> [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [0.213s]
> [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [0.970s]
> [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.158s]
> [INFO] hadoop-yarn-client ................................ SUCCESS [0.227s]
> [INFO] hadoop-yarn-applications .......................... SUCCESS [0.013s]
> [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.157s]
> [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.094s]
> [INFO] hadoop-yarn-site .................................. SUCCESS [0.024s]
> [INFO] hadoop-yarn-project ............................... SUCCESS [0.030s]
> [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.027s]
> [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [1.206s]
> [INFO] hadoop-mapreduce-client-common .................... SUCCESS [1.140s]
> [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.128s]
> [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [0.634s]
> [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [0.557s]
> [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [0.882s]
> [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.085s]
> [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.224s]
> [INFO] hadoop-mapreduce .................................. SUCCESS [0.030s]
> [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.200s]
> [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [0.656s]
> [INFO] Apache Hadoop Archives ............................ SUCCESS [0.112s]
> [INFO] Apache Hadoop Rumen ............................... SUCCESS [0.246s]
> [INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.283s]
> [INFO] Apache Hadoop Data Join ........................... SUCCESS [0.111s]
> [INFO] Apache Hadoop Extras .............................. SUCCESS [0.146s]
> [INFO] Apache Hadoop Pipes ............................... SUCCESS [0.011s]
> [INFO] Apache Hadoop OpenStack support ................... SUCCESS [0.283s]
> [INFO] Apache Hadoop Client .............................. SUCCESS [0.106s]
> [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.038s]
> [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [0.223s]
> [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [0.106s]
> [INFO] Apache Hadoop Tools ............................... SUCCESS [0.010s]
> [INFO] Apache Hadoop Distribution ........................ SUCCESS [0.034s]
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 45.478s
> [INFO] Finished at: Wed Aug 27 12:10:06 HKT 2014
> [INFO] Final Memory: 107M/1898M
> [INFO] ------------------------------------------------------------------------
> mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy=true									// package it with snappy
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO] 
> [INFO] Apache Hadoop Main ................................ SUCCESS [0.727s]
> [INFO] Apache Hadoop Project POM ......................... SUCCESS [0.555s]
> [INFO] Apache Hadoop Annotations ......................... SUCCESS [1.011s]
> [INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.128s]
> [INFO] Apache Hadoop Project Dist POM .................... SUCCESS [1.342s]
> [INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [1.251s]
> [INFO] Apache Hadoop MiniKDC ............................. SUCCESS [1.007s]
> [INFO] Apache Hadoop Auth ................................ SUCCESS [1.252s]
> [INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.929s]
> [INFO] Apache Hadoop Common .............................. SUCCESS [41.330s]
> [INFO] Apache Hadoop NFS ................................. SUCCESS [1.986s]
> [INFO] Apache Hadoop Common Project ...................... SUCCESS [0.015s]
> [INFO] Apache Hadoop HDFS ................................ SUCCESS [1:08.367s]
> [INFO] Apache Hadoop HttpFS .............................. SUCCESS [47.198s]
> [INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [2.807s]
> [INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [1.350s]
> [INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.027s]
> [INFO] hadoop-yarn ....................................... SUCCESS [0.013s]
> [INFO] hadoop-yarn-api ................................... SUCCESS [36.848s]
> [INFO] hadoop-yarn-common ................................ SUCCESS [12.502s]
> [INFO] hadoop-yarn-server ................................ SUCCESS [0.032s]
> [INFO] hadoop-yarn-server-common ......................... SUCCESS [3.688s]
> [INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [8.207s]
> [INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [1.048s]
> [INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [1.839s]
> [INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [4.766s]
> [INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.247s]
> [INFO] hadoop-yarn-client ................................ SUCCESS [1.735s]
> [INFO] hadoop-yarn-applications .......................... SUCCESS [0.013s]
> [INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.984s]
> [INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.792s]
> [INFO] hadoop-yarn-site .................................. SUCCESS [0.034s]
> [INFO] hadoop-yarn-project ............................... SUCCESS [3.327s]
> [INFO] hadoop-mapreduce-client ........................... SUCCESS [0.090s]
> [INFO] hadoop-mapreduce-client-core ...................... SUCCESS [7.451s]
> [INFO] hadoop-mapreduce-client-common .................... SUCCESS [7.081s]
> [INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.972s]
> [INFO] hadoop-mapreduce-client-app ....................... SUCCESS [3.085s]
> [INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [3.119s]
> [INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [1.934s]
> [INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.772s]
> [INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [2.162s]
> [INFO] hadoop-mapreduce .................................. SUCCESS [2.622s]
> [INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [1.744s]
> [INFO] Apache Hadoop Distributed Copy .................... SUCCESS [4.466s]
> [INFO] Apache Hadoop Archives ............................ SUCCESS [0.956s]
> [INFO] Apache Hadoop Rumen ............................... SUCCESS [2.203s]
> [INFO] Apache Hadoop Gridmix ............................. SUCCESS [1.509s]
> [INFO] Apache Hadoop Data Join ........................... SUCCESS [0.909s]
> [INFO] Apache Hadoop Extras .............................. SUCCESS [1.103s]
> [INFO] Apache Hadoop Pipes ............................... SUCCESS [4.794s]
> [INFO] Apache Hadoop OpenStack support ................... SUCCESS [2.111s]
> [INFO] Apache Hadoop Client .............................. SUCCESS [3.919s]
> [INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.044s]
> [INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [1.665s]
> [INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.936s]
> [INFO] Apache Hadoop Tools ............................... SUCCESS [0.042s]
> [INFO] Apache Hadoop Distribution ........................ SUCCESS [15.208s]
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 5:22.529s
> [INFO] Finished at: Wed Aug 27 12:17:06 HKT 2014
> [INFO] Final Memory: 86M/755M
> [INFO] ------------------------------------------------------------------------
> 
> ll hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/
> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:12 libhadoop.a
> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 12:12 libhadoop.so -> libhadoop.so.1.0.0
> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:12 libhadoop.so.1.0.0
> 
> (copy them to $HADOOP_HOME/lib/native/Linux-amd64-64 and $HBASE_HOME/lib/native/Linux-amd64-64)
> cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/* $HADOOP_HOME/lib/native/Linux-amd64-64/
> cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/* $HBASE_HOME/lib/native/Linux-amd64-64/
> 
> ll $HADOOP_HOME/lib/native/Linux-amd64-64/
> total 21236
> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a									// new
> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 06:54 libhadoopsnappy.so.0.0.1
> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so									// new
> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0								// new
> lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 06:54 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 06:54 libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so -> libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 06:54 libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so -> libprotoc.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so.8 -> libprotoc.so.8.0.0
> -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 06:54 libprotoc.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:31 libsnappy.so -> /usr/lib64/libsnappy.so
> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so.1 -> /usr/lib64/libsnappy.so
> -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 06:54 libsnappy.so.1.2.0
> drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 11:15 pkgconfig
> 
> 
> ll $HBASE_HOME/lib/native/Linux-amd64-64/
> -rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a									// new
> -rw-rw-r--. 1 hduser hadoop 1487564 Aug 27 11:14 libhadooppipes.a
> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> -rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so									// new
> -rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0								// new
> -rw-rw-r--. 1 hduser hadoop  582472 Aug 27 11:14 libhadooputils.a
> -rw-rw-r--. 1 hduser hadoop  298626 Aug 27 11:14 libhdfs.a
> -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so
> -rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so.0.0.0
> lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> -rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> -rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
> -rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so -> /usr/lib64/libsnappy.so
> lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:33 libsnappy.so.1 -> /usr/lib64/libsnappy.so
> -rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 07:08 pkgconfig
> 
> 
> 
> sudo yum install snappy snappy-devel
> Loaded plugins: fastestmirror, security
> Loading mirror speeds from cached hostfile
>  ...
> Package snappy-1.1.0-1.el6.x86_64 already installed and latest version
> Package snappy-devel-1.1.0-1.el6.x86_64 already installed and latest version
> Nothing to do
> 
> 
> ln -sf /usr/lib64/libsnappy.so $HADOOP_HOME/lib/native/Linux-amd64-64/.
> ln -sf /usr/lib64/libsnappy.so $HBASE_HOME/lib/native/Linux-amd64-64/.
> 
> ll $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so
> lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:31 $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so -> /usr/lib64/libsnappy.so
> ll $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so
> lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:32 $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so -> /usr/lib64/libsnappy.so
> 
> 
> 
> ($HADOOP_HOME/etc/hadoop/hadoop-env.sh  added following)
> ### 2014-08-27
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> ###
> 
> ($HBASE_HOME/conf/hbase-env.sh added following)
> ### 2014-08-27
> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> ###
> 
> 
> (restarted both HADOOP and HBASE)
> jps
> 26324 HRegionServer
> 26137 HMaster
> 25567 JobHistoryServer
> 25485 NodeManager
> 25913 WebAppProxyServer
> 24831 DataNode
> 24712 NameNode
> 27146 Jps
> 9219 QuorumPeerMain
> 25042 JournalNode
> 25239 DFSZKFailoverController
> 25358 ResourceManager
> 
> 
> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> 2014-08-27 12:24:08,030 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-27 12:24:08,387 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-27 12:24:08,388 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> 	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> 	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> 	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> 	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> 	at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> 	at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> 	at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> 	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> 	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> 	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> 
> 
> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
> 2014-08-27 12:35:34,485 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-27 12:35:35,822 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
> 2014-08-27 12:35:35,851 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-27 12:35:35,855 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-27 12:35:35,866 ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
> 2014-08-27 12:35:36,636 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
> SUCCESS
> 
> 
> 
> 
> 
> So I still get the same issue. I suspect the problem comes from the Hadoop compilation, but I have no idea what would be wrong. Please help.
> 
> 
> in my /etc/hadoop/core-site.xml, I have following related to snappy:
>    <property>
>     <name>io.compression.codecs</name>
>     <value>
>       org.apache.hadoop.io.compress.GzipCodec,
>       org.apache.hadoop.io.compress.DefaultCodec,
>       org.apache.hadoop.io.compress.BZip2Codec,
>       org.apache.hadoop.io.compress.SnappyCodec
>     </value>
>    </property>
> 
> in my mapred-site.xml, I have following related to snappy:
>    <property>
>     <name>mapred.output.compress</name>
>     <value>false</value>
>     <description>Should the job outputs be compressed?</description>
>    </property>
>    <property>
>     <name>mapred.output.compression.type</name>
>     <value>RECORD</value>
>     <description>If the job outputs are to be compressed as SequenceFiles, how should they be compressed? Should be one of NONE, RECORD or BLOCK.</description>
>    </property>
>    <property>
>     <name>mapred.output.compression.codec</name>
>     <value>org.apache.hadoop.io.compress.SnappyCodec</value>
>     <description>If the job outputs are compressed, how should they be compressed?
>     </description>
>    </property>
>    <property>
>     <name>mapred.compress.map.output</name>
>     <value>true</value>
>     <description>Should the outputs of the maps be compressed before being sent across the network. Uses SequenceFile compression.</description>
>    </property>
>    <property>
>     <name>mapred.map.output.compression.codec</name>
>     <value>org.apache.hadoop.io.compress.SnappyCodec</value>
>     <description>If the map outputs are compressed, how should they be compressed?</description>
>   </property>
> 
>   <property>
>    <name>mapreduce.map.output.compress</name>  
>    <value>true</value>
>   </property>
>   <property>
>    <name>mapred.map.output.compress.codec</name>  
>    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
>   </property>
> 
> 
> I didn't add any snappy-related property to hbase-site.xml
> 
> 
> 
> Regards
> Arthur
> 
> 
> 
> 
> On 27 Aug, 2014, at 8:07 am, Andrew Purtell <ap...@apache.org> wrote:
> 
>> On Tue, Aug 26, 2014 at 4:25 PM, Arthur.hk.chan@gmail.com <
>> arthur.hk.chan@gmail.com> wrote:
>> 
>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>>> library not available: this version of libhadoop was built without snappy
>>> support.
>> 
>> You are almost there. Unfortunately the native Hadoop libraries you copied
>> into HBase's lib/native/Linux-amd64-64/ directory were apparently
>> built without snappy support, as the exception indicates. You'll need to
>> compile the native Hadoop libraries with snappy support enabled. Install
>> snappy-devel as Alex mentioned and then build the Hadoop native libraries.
>> 
>> 1. Get Hadoop sources for the Hadoop version
>> 2. tar xvzf ....
>> 3. cd /path/to/hadoop/src
>> 4. mvn -DskipTests clean install -Drequire.snappy=true -Pnative
>> 5. cp
>> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.*
>> /path/to/hbase/lib/native/Linux-amd64-64
>> 
>> (The -Drequire.snappy=true will fail the build if Snappy link libraries
>> are not installed, so you can be sure of this.)
>> 
>> 
>> -- 
>> Best regards,
>> 
>>   - Andy
>> 
>> Problems worthy of attack prove their worth by hitting back. - Piet Hein
>> (via Tom White)
> 


[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [8.207s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [1.048s]
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [1.839s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [4.766s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.247s]
[INFO] hadoop-yarn-client ................................ SUCCESS [1.735s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.013s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.984s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [0.792s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.034s]
[INFO] hadoop-yarn-project ............................... SUCCESS [3.327s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.090s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [7.451s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [7.081s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.972s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [3.085s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [3.119s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [1.934s]
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [0.772s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [2.162s]
[INFO] hadoop-mapreduce .................................. SUCCESS [2.622s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [1.744s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [4.466s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [0.956s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [2.203s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [1.509s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [0.909s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [1.103s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [4.794s]
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [2.111s]
[INFO] Apache Hadoop Client .............................. SUCCESS [3.919s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.044s]
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [1.665s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.936s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.042s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [15.208s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 5:22.529s
[INFO] Finished at: Wed Aug 27 12:17:06 HKT 2014
[INFO] Final Memory: 86M/755M
[INFO] ------------------------------------------------------------------------

ll hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/
-rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:12 libhadoop.a
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 12:12 libhadoop.so -> libhadoop.so.1.0.0
-rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:12 libhadoop.so.1.0.0

(copy them to the native lib directories under $HADOOP_HOME and $HBASE_HOME)
cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/* $HADOOP_HOME/lib/native/Linux-amd64-64/
cp hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/* $HBASE_HOME/lib/native/Linux-amd64-64/
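
A note on the cp above: plain `cp` dereferences symlinks, so `libhadoop.so` lands as a full regular-file copy rather than a link (visible in the listings that follow). If you want the link structure preserved, `cp -a` keeps it; a throwaway illustration (scratch paths, not the real Hadoop tree):

```shell
# Demonstrate that `cp -a` preserves a symlink, whereas plain `cp`
# would copy the file the link points to. Scratch paths only.
mkdir -p /tmp/libsrc /tmp/libdst
touch /tmp/libsrc/libdemo.so.1.0.0
ln -sf libdemo.so.1.0.0 /tmp/libsrc/libdemo.so
cp -a /tmp/libsrc/. /tmp/libdst/
ls -l /tmp/libdst/libdemo.so    # still a symlink to libdemo.so.1.0.0
```

Either way works at runtime; the copies just take more disk and will not track a rebuilt `libhadoop.so.1.0.0`.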

ll $HADOOP_HOME/lib/native/Linux-amd64-64/
total 21236
-rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a									// new
lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 06:54 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
-rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 06:54 libhadoopsnappy.so.0.0.1
-rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so									// new
-rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0								// new
lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 06:54 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 06:54 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 06:54 libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so -> libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 06:54 libprotobuf.so.8 -> libprotobuf.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 06:54 libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so -> libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 06:54 libprotoc.so.8 -> libprotoc.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 06:54 libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:31 libsnappy.so -> /usr/lib64/libsnappy.so
lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so.1 -> /usr/lib64/libsnappy.so
-rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 06:54 libsnappy.so.1.2.0
drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 11:15 pkgconfig


ll $HBASE_HOME/lib/native/Linux-amd64-64/
-rw-rw-r--. 1 hduser hadoop 1062640 Aug 27 12:19 libhadoop.a									// new
-rw-rw-r--. 1 hduser hadoop 1487564 Aug 27 11:14 libhadooppipes.a
lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
lrwxrwxrwx. 1 hduser hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
-rwxr-xr-x. 1 hduser hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
-rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so									// new
-rwxrwxr-x. 1 hduser hadoop  630328 Aug 27 12:19 libhadoop.so.1.0.0								// new
-rw-rw-r--. 1 hduser hadoop  582472 Aug 27 11:14 libhadooputils.a
-rw-rw-r--. 1 hduser hadoop  298626 Aug 27 11:14 libhdfs.a
-rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so
-rwxrwxr-x. 1 hduser hadoop  200370 Aug 27 11:14 libhdfs.so.0.0.0
lrwxrwxrwx. 1 hduser hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
-rwxr-xr-x. 1 hduser hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:32 libsnappy.so -> /usr/lib64/libsnappy.so
lrwxrwxrwx. 1 hduser hadoop      23 Aug 27 11:33 libsnappy.so.1 -> /usr/lib64/libsnappy.so
-rwxr-xr-x. 1 hduser hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
drwxr-xr-x. 2 hduser hadoop    4096 Aug 27 07:08 pkgconfig



sudo yum install snappy snappy-devel
Loaded plugins: fastestmirror, security
Loading mirror speeds from cached hostfile
 ...
Package snappy-1.1.0-1.el6.x86_64 already installed and latest version
Package snappy-devel-1.1.0-1.el6.x86_64 already installed and latest version
Nothing to do


ln -sf /usr/lib64/libsnappy.so $HADOOP_HOME/lib/native/Linux-amd64-64/.
ln -sf /usr/lib64/libsnappy.so $HBASE_HOME/lib/native/Linux-amd64-64/.

ll $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so
lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:31 $HADOOP_HOME/lib/native/Linux-amd64-64/libsnappy.so -> /usr/lib64/libsnappy.s
ll $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so
lrwxrwxrwx. 1 hduser hadoop 23 Aug 27 11:32 $HBASE_HOME/lib/native/Linux-amd64-64/libsnappy.so -> /usr/lib64/libsnappy.so
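
After creating links like these, it is worth confirming that each one resolves to a real file; a dangling link in the native directory fails silently at load time. A small sketch (scratch paths, not your real layout):

```shell
# Resolve a symlink chain with readlink -f and check the final target exists.
mkdir -p /tmp/native-check
touch /tmp/native-check/libsnappy.so.1.2.0
ln -sf libsnappy.so.1.2.0 /tmp/native-check/libsnappy.so
target=$(readlink -f /tmp/native-check/libsnappy.so)
if [ -f "$target" ]; then
  echo "OK: $target"
else
  echo "DANGLING: /tmp/native-check/libsnappy.so"
fi
```

Running the same check against each `libsnappy.so*` entry in `$HADOOP_HOME/lib/native/Linux-amd64-64/` rules out one common failure mode.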



(added the following to $HADOOP_HOME/etc/hadoop/hadoop-env.sh)
### 2014-08-27
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
###

(added the following to $HBASE_HOME/conf/hbase-env.sh)
### 2014-08-27
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
###
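
One thing to keep in mind with these exports: the dynamic loader searches LD_LIBRARY_PATH left to right, so an older snappy-less libhadoop.so in an earlier directory shadows the rebuilt one. The first-match rule is the same as ordinary PATH lookup, which is easy to demonstrate (scratch dirs, illustrative names):

```shell
# First-match-wins demo using PATH; LD_LIBRARY_PATH applies the same
# rule when resolving shared libraries. Scratch dirs only.
mkdir -p /tmp/dir_old /tmp/dir_new
printf '#!/bin/sh\necho old\n' > /tmp/dir_old/whichlib
printf '#!/bin/sh\necho new\n' > /tmp/dir_new/whichlib
chmod +x /tmp/dir_old/whichlib /tmp/dir_new/whichlib
env PATH=/tmp/dir_old:/tmp/dir_new whichlib   # prints "old"
```

So if an earlier directory on LD_LIBRARY_PATH already holds a libhadoop.so, the one appended here never gets loaded.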


(restarted both Hadoop and HBase)
jps
26324 HRegionServer
26137 HMaster
25567 JobHistoryServer
25485 NodeManager
25913 WebAppProxyServer
24831 DataNode
24712 NameNode
27146 Jps
9219 QuorumPeerMain
25042 JournalNode
25239 DFSZKFailoverController
25358 ResourceManager


bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
2014-08-27 12:24:08,030 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-27 12:24:08,387 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-27 12:24:08,388 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
	at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
	at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
	at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
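
For what it's worth, a quick way to see which native codecs the loaded libhadoop actually supports is Hadoop's `checknative` subcommand (present in recent Hadoop 2 releases; I believe 2.4 has it). If its snappy row reports false, the JVM is picking up a libhadoop.so that was built without snappy support, regardless of which libsnappy.so links exist:

```shell
# Report per-codec native support. Guarded so the snippet still runs
# on a machine where hadoop is not on PATH.
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a
else
  echo "hadoop not on PATH; run this on the cluster node"
fi
```

That tells you immediately whether the problem is the libhadoop build or merely the library path.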


bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
2014-08-27 12:35:34,485 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/mnt/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/mnt/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-27 12:35:35,495 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
2014-08-27 12:35:35,822 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2014-08-27 12:35:35,851 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
2014-08-27 12:35:35,855 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
2014-08-27 12:35:35,866 ERROR [main] hbase.KeyValue: Unexpected getShortMidpointKey result, fakeKey:testkey, firstKeyInBlock:testkey
2014-08-27 12:35:36,636 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
SUCCESS





So I still get the same issue. I suspect the problem comes from the Hadoop compilation, but I have no idea what is wrong. Please help.


in my /etc/hadoop/core-site.xml, I have following related to snappy:
   <property>
    <name>io.compression.codecs</name>
    <value>
      org.apache.hadoop.io.compress.GzipCodec,
      org.apache.hadoop.io.compress.DefaultCodec,
      org.apache.hadoop.io.compress.BZip2Codec,
      org.apache.hadoop.io.compress.SnappyCodec
    </value>
   </property>

in my mapred-site.xml, I have following related to snappy:
   <property>
    <name>mapred.output.compress</name>
    <value>false</value>
    <description>Should the job outputs be compressed?</description>
   </property>
   <property>
    <name>mapred.output.compression.type</name>
    <value>RECORD</value>
    <description>If the job outputs are to compressed as SequenceFiles, how should they be compressed? Should be one of NONE, RECORD or BLOCK.</description>
   </property>
   <property>
    <name>mapred.output.compression.codec</name>
    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
    <description>If the job outputs are compressed, how should they be compressed?
    </description>
   </property>
   <property>
    <name>mapred.compress.map.output</name>
    <value>true</value>
    <description>Should the outputs of the maps be compressed before being sent across the network. Uses SequenceFile compression.</description>
   </property>
   <property>
    <name>mapred.map.output.compression.codec</name>
    <value>org.apache.hadoop.io.compress.SnappyCodec</value>
    <description>If the map outputs are compressed, how should they be compressed?</description>
  </property>

  <property>
   <name>mapreduce.map.output.compress</name>  
   <value>true</value>
  </property>
  <property>
   <name>mapred.map.output.compress.codec</name>  
   <value>org.apache.hadoop.io.compress.SnappyCodec</value>
  </property>
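
As an aside, the `mapred.*` keys above are the Hadoop 1 names; Hadoop 2.4 still honors them through its deprecated-key mapping, but the current equivalents (shown here for the map-output pair as an example) are:

```xml
<!-- Hadoop 2 names for the map-output compression settings above -->
<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>
<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
```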


I didn’t add any snappy-related property to hbase-site.xml.



Regards
Arthur




On 27 Aug, 2014, at 8:07 am, Andrew Purtell <ap...@apache.org> wrote:

> On Tue, Aug 26, 2014 at 4:25 PM, Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com> wrote:
> 
>> Exception in thread "main" java.lang.RuntimeException: native snappy
>> library not available: this version of libhadoop was built without snappy
>> support.
> 
> ​
> You are almost there. Unfortunately the native Hadoop libraries you copied
> into HBase's lib/native/Linux-amd64-64/ directory were
> ​apparently ​
> built without snappy support, as the exception indicates. You'll need to
> compile the native Hadoop libraries with snappy support enabled. Install
> snappy-devel as Alex mentioned and then build the Hadoop native libraries.
> 
> 1. Get Hadoop sources for the Hadoop version
> 2. tar xvzf ....
> 3. cd /path/to/hadoop/src
> 4. mvn -DskipTests clean install
> ​ -Drequire.snappy=true​
> -Pnative
> 5. cp
> hadoop-common-project/hadoop-common/target/native/target/usr/local/lib/libhadoop.*
> /path/to/hbase/lib/native/Linux-amd64-64
> 
> ​(The -Drequire.snappy=true will fail the build if Snappy link libraries
> are not installed, so you can be sure of this.)​
> 
> 
> -- 
> Best regards,
> 
>   - Andy
> 
> Problems worthy of attack prove their worth by hitting back. - Piet Hein
> (via Tom White)



Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Thanks!

java org.apache.hadoop.util.PlatformName
Linux-amd64-64


On 27 Aug, 2014, at 8:31 am, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> This command will give you the exact name:
> 
> java org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"
> 
> Can you try to run it?
> 
> But it's most probably Linux-amd64-64
> 
> 
> 
> 2014-08-26 20:24 GMT-04:00 Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com>:
> 
>> Hi,
>> 
>> Thanks!
>> 
>> A question:
>> If I run:
>> $  uname -m
>> x86_64
>> 
>> Should I use " lib/native/Linux-amd64-64” or  "lib/native/x86_64”  in
>> $HADOOP_HOME and $HBASE_HOME?
>> 
>> Arthur
>> 
>> 
>> On 27 Aug, 2014, at 8:10 am, Jean-Marc Spaggiari <je...@spaggiari.org>
>> wrote:
>> 
>>> Ok.
>>> 
>>> This is the way the lib path is built:
>>> 
>>> JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH"
>>> ${HBASE_HOME}/build/native/${JAVA_PLATFORM}/lib)
>>> 
>>> And JAVA_PLATFORM comes from JAVA_PLATFORM=`CLASSPATH=${CLASSPATH}
>> ${JAVA}
>>> org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
>>> 
>>> You can double check it doing:
>>> 
>>> # Adjust to you java_home...
>>> export JAVA_HOME=/usr/local/jdk1.7.0_45/
>>> 
>>> export CLASSPATH=`bin/hbase classpath`
>>> $JAVA_HOME/bin/java org.apache.hadoop.util.PlatformName | sed -e "s/
>> /_/g"
>>> 
>>> Result for me is this: Linux-amd64-64. Might  be different for you.
>>> 
>>> Then you link the libs the way Alex said before:
>>> cd lib/native/Linux-amd64-64
>>> ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so .
>>> ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1 .
>>> 
>>> AND.....
>>> 
>>> The hadoop so too! And I think this is what's missing for you:
>>> ln -s /YOURHADOOPPATH/libhadoop.so .
>>> 
>>> Your folder should look like this:
>>> jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ tree
>>> .
>>> └── Linux-amd64-64
>>>   ├── libhadoop.so
>>>   ├── libsnappy.so -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so
>>>   └── libsnappy.so.1 -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1
>>> 
>>> I copied libhadoop.so instead of doing a link because it was not
>> available
>>> on this computer.
>>> 
>>> Then test it:
>>> jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase
>>> org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test
>> snappy
>>> 2014-08-26 20:06:43,987 INFO  [main] Configuration.deprecation:
>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> 2014-08-26 20:06:44,831 INFO  [main] util.ChecksumType: Checksum using
>>> org.apache.hadoop.util.PureJavaCrc32
>>> 2014-08-26 20:06:44,832 INFO  [main] util.ChecksumType: Checksum can use
>>> org.apache.hadoop.util.PureJavaCrc32C
>>> 2014-08-26 20:06:45,125 INFO  [main] compress.CodecPool: Got brand-new
>>> compressor [.snappy]
>>> 2014-08-26 20:06:45,131 INFO  [main] compress.CodecPool: Got brand-new
>>> compressor [.snappy]
>>> 2014-08-26 20:06:45,254 INFO  [main] compress.CodecPool: Got brand-new
>>> decompressor [.snappy]
>>> SUCCESS
>>> 
>>> 
>>> Please let us know if it still doesn't work for you. Without libhadoop.so
>>> it doesn't work for me...
>>> jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ rm
>>> Linux-amd64-64/libhadoop.so
>>> 
>>> jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase
>>> org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test
>> snappy
>>> 2014-08-26 20:09:28,945 INFO  [main] Configuration.deprecation:
>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> 2014-08-26 20:09:29,460 WARN  [main] util.NativeCodeLoader: Unable to
>> load
>>> native-hadoop library for your platform... using builtin-java classes
>> where
>>> applicable
>>> 2014-08-26 20:09:29,775 INFO  [main] util.ChecksumType: Checksum using
>>> org.apache.hadoop.util.PureJavaCrc32
>>> 2014-08-26 20:09:29,776 INFO  [main] util.ChecksumType: Checksum can use
>>> org.apache.hadoop.util.PureJavaCrc32C
>>> Exception in thread "main" java.lang.UnsatisfiedLinkError:
>>> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>>>   at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
>>> Method)
>>> ...
>>> 
>>> 
>>> I did all of that using a brand new extracted
>>> hbase-0.98.5-hadoop2-bin.tar.gz file.
>>> 
>>> JM
>>> 
>>> 
>>> 2014-08-26 19:47 GMT-04:00 Arthur.hk.chan@gmail.com <
>>> arthur.hk.chan@gmail.com>:
>>> 
>>>> $ uname -m
>>>> x86_64
>>>> 
>>>> Arthur
>>>> 
>>>> On 27 Aug, 2014, at 7:45 am, Jean-Marc Spaggiari <
>> jean-marc@spaggiari.org>
>>>> wrote:
>>>> 
>>>>> Hi Arthur,
>>>>> 
>>>>> What uname -m gives you? you need to check that to create the right
>>>> folder
>>>>> under the lib directory.
>>>>> 
>>>>> JM
>>>>> 
>>>>> 
>>>>> 2014-08-26 19:43 GMT-04:00 Alex Kamil <al...@gmail.com>:
>>>>> 
>>>>>> Something like this worked for me
>>>>>> 1. get hbase binaries
>>>>>> 2. sudo yum install snappy snappy-devel
>>>>>> 3. ln -sf /usr/lib64/libsnappy.so
>>>>>> /var/lib/hadoop/lib/native/Linux-amd64-64/.
>>>>>> 4. ln -sf /usr/lib64/libsnappy.so
>>>>>> /var/lib/hbase/lib/native/Linux-amd64-64/.
>>>>>> 5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
>>>>>> ref: https://issues.apache.org/jira/browse/PHOENIX-877
>>>>>> 
>>>>>> 
>>>>>> On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <
>>>>>> arthur.hk.chan@gmail.com> wrote:
>>>>>> 
>>>>>>> Hi,
>>>>>>> 
>>>>>>> I just tried three more steps but was not able to get thru.
>>>>>>> 
>>>>>>> 
>>>>>>> 1) copied  snappy files to $HBASE_HOME/lib
>>>>>>> $ cd $HBASE_HOME
>>>>>>> $ ll lib/*sna*
>>>>>>> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54
>>>>>>> lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>>>>>>> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43
>>>>>> lib/snappy-java-1.0.4.1.jar
>>>>>>> 
>>>>>>> ll lib/native/
>>>>>>> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
>>>>>>> 
>>>>>>> ll lib/native/Linux-amd64-64/
>>>>>>> total 18964
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so
>> ->
>>>>>>> libhadoopsnappy.so.0.0.1
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0
>>>> ->
>>>>>>> libhadoopsnappy.so.0.0.1
>>>>>>> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08
>>>> libhadoopsnappy.so.0.0.1
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so ->
>>>>>>> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so
>> ->
>>>>>>> libprotobuf-lite.so.8.0.0
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08
>> libprotobuf-lite.so.8
>>>> ->
>>>>>>> libprotobuf-lite.so.8.0.0
>>>>>>> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08
>>>>>> libprotobuf-lite.so.8.0.0
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so ->
>>>>>>> libprotobuf.so.8.0.0
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
>>>>>>> libprotobuf.so.8.0.0
>>>>>>> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so ->
>>>>>>> libprotoc.so.8.0.0
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
>>>>>>> libprotoc.so.8.0.0
>>>>>>> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so ->
>>>>>>> libsnappy.so.1.2.0
>>>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 ->
>>>>>>> libsnappy.so.1.2.0
>>>>>>> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
>>>>>>> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
>>>>>>> 
>>>>>>> 2)  $HBASE_HOME/conf/hbase-env.sh, added
>>>>>>> 
>>>>>>> ###
>>>>>>> export
>>>>>>> 
>>>>>> 
>>>> 
>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
>>>>>>> export
>>>>>>> 
>>>>>> 
>>>> 
>> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>>>>>>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
>>>>>>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
>>>>>>> ###
>>>>>>> 
>>>>>>> 3) restart HBASE and tried again
>>>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>>>>>> file:///tmp/snappy-test snappy
>>>>>>> 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation:
>>>>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>> SLF4J: Found binding in
>>>>>>> 
>>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: Found binding in
>>>>>>> 
>>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>>>> explanation.
>>>>>>> 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum
>> using
>>>>>>> org.apache.hadoop.util.PureJavaCrc32
>>>>>>> 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can
>>>> use
>>>>>>> org.apache.hadoop.util.PureJavaCrc32C
>>>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>>>>>>> library not available: this version of libhadoop was built without
>>>> snappy
>>>>>>> support.
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>>>>>      at
>>>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>>>>>      at
>>>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>>>>>      at
>>>>>>> 
>>>>>> 
>>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>>>>>> 
>>>>>>> 
>>>>>>> Regards
>>>>>>> Arthur
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <
>>>>>>> arthur.hk.chan@gmail.com> wrote:
>>>>>>> 
>>>>>>>> Hi Sean,
>>>>>>>> 
>>>>>>>> Thanks for your reply.
>>>>>>>> 
>>>>>>>> I tried the following tests
>>>>>>>> 
>>>>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
>>>>>>>> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>>>>>>> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
>>>>>>>> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
>>>>>>>> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
>>>>>>>> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
>>>>>>>> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
>>>>>>>> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
>>>>>>>> SUCCESS
>>>>>>>> 
>>>>>>>> 
>>>>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
>>>>>>>> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>>>>>>> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
>>>>>>>> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
>>>>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
>>>>>>>>    at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>>>>>>    at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>>>>>>    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>>>>>>    at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>>>>>>    at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>>>>>>    at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>>>>>>    at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>>>>>>    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>>>>>>    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>>>>>>    at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>>>>>>    at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>>>>>>    at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>>>>>>    at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>>>>>>> 
>>>>>>>> 
>>>>>>>> $ hbase shell
>>>>>>>> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>>>>> HBase Shell; enter 'help<RETURN>' for list of supported commands.
>>>>>>>> Type "exit<RETURN>" to leave the HBase Shell
>>>>>>>> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
>>>>>>>> 
>>>>>>>> hbase(main):001:0>
>>>>>>>> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION => 'snappy'}
>>>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
>>>>>>>> 
>>>>>>>> ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
>>>>>>>>    at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
>>>>>>>>    at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
>>>>>>>>    at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
>>>>>>>>    at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
>>>>>>>>    at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
>>>>>>>>    at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
>>>>>>>>    at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
>>>>>>>>    at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
>>>>>>>>    at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>>>    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>>>>>>    at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>>>    at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>>>    at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>>>>>    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>>>>>    at java.lang.Thread.run(Thread.java:662)
>>>>>>>> 
>>>>>>>> 
>>>>>>>> 
>>>>>>>> 
>>>>>>>> Regards
>>>>>>>> Arthur
>>>>>>>> 
>>>>>>>> 
>>>>>>>> On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com>
>>>> wrote:
>>>>>>>> 
>>>>>>>>> Hi Arthur!
>>>>>>>>> 
>>>>>>>>> Our Snappy build instructions are currently out of date and I'm working
>>>>>>>>> on updating them[1]. In short, I don't think there are any special build
>>>>>>>>> steps for using snappy.
>>>>>>>>> 
>>>>>>>>> I'm still working out what needs to be included in our instructions for
>>>>>>>>> local and cluster testing.
>>>>>>>>> 
>>>>>>>>> If you use the test for compression options, locally things will fail
>>>>>>>>> because the native hadoop libs won't be present:
>>>>>>>>> 
>>>>>>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
>>>>>>>>> (for comparison, replace "snappy" with "gz" and you will get a warning
>>>>>>>>> about not having native libraries, but the test will succeed.)
>>>>>>>>> 
>>>>>>>>> I believe JM's suggestion is for you to copy the Hadoop native
>>>>>>>>> libraries into the local HBase lib/native directory, which would allow the
>>>>>>>>> local test to pass. If you are running in a deployed Hadoop cluster, I
>>>>>>>>> would expect the necessary libraries to already be available to HBase.
>>>>>>>>> 
>>>>>>>>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
>>>>>>>>> 
>>>>>>>>> -Sean
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
>>>>>>> arthur.hk.chan@gmail.com> wrote:
>>>>>>>>> Hi JM
>>>>>>>>> 
>>>>>>>>> Below are my commands, tried two cases under same source code
>> folder:
>>>>>>>>> a) compile with snappy parameters(failed),
>>>>>>>>> b) compile without snappy parameters (successful).
>>>>>>>>> 
>>>>>>>>> Regards
>>>>>>>>> Arthur
>>>>>>>>> 
>>>>>>>>> wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
>>>>>>>>> tar -vxf hbase-0.98.4-src.tar.gz
>>>>>>>>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
>>>>>>>>> cd  hbase-0.98.4-src_snappy
>>>>>>>>> nano dev-support/generate-hadoopX-poms.sh
>>>>>>>>> (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>>>>>> a) with snappy parameters
>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Reactor Summary:
>>>>>>>>> [INFO]
>>>>>>>>> [INFO] HBase ............................................. SUCCESS [8.192s]
>>>>>>>>> [INFO] HBase - Common .................................... SUCCESS [5.638s]
>>>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
>>>>>>>>> [INFO] HBase - Client .................................... SUCCESS [1.206s]
>>>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
>>>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
>>>>>>>>> [INFO] HBase - Server .................................... FAILURE [0.234s]
>>>>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>>>>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Total time: 19.474s
>>>>>>>>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
>>>>>>>>> [INFO] Final Memory: 51M/1100M
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>>>>>> [ERROR]
>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>>>>> [ERROR]
>>>>>>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>>>>>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>>>>>> [ERROR]
>>>>>>>>> [ERROR] After correcting the problems, you can resume the build with the command
>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> b) try again, without snappy parameters
>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
>>>>>>>>> [INFO] Building tar: /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Reactor Summary:
>>>>>>>>> [INFO]
>>>>>>>>> [INFO] HBase ............................................. SUCCESS [3.290s]
>>>>>>>>> [INFO] HBase - Common .................................... SUCCESS [3.119s]
>>>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
>>>>>>>>> [INFO] HBase - Client .................................... SUCCESS [0.920s]
>>>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
>>>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
>>>>>>>>> [INFO] HBase - Server .................................... SUCCESS [4.790s]
>>>>>>>>> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
>>>>>>>>> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
>>>>>>>>> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
>>>>>>>>> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
>>>>>>>>> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
>>>>>>>>> [INFO] HBase - Assembly .................................. SUCCESS [13.240s]
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] BUILD SUCCESS
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> [INFO] Total time: 31.408s
>>>>>>>>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
>>>>>>>>> [INFO] Final Memory: 57M/1627M
>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
>>>>>>> jean-marc@spaggiari.org> wrote:
>>>>>>>>> 
>>>>>>>>>> Hi Arthur,
>>>>>>>>>> 
>>>>>>>>>> How have you extracted HBase source and what command do you run to
>>>>>>> build? I
>>>>>>>>>> will do the same here locally so I can provide you the exact step
>> to
>>>>>>>>>> complete.
>>>>>>>>>> 
>>>>>>>>>> JM
>>>>>>>>>> 
>>>>>>>>>> 
>>>>>>>>>> 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>>>>> arthur.hk.chan@gmail.com
>>>>>>>>>>> :
>>>>>>>>>> 
>>>>>>>>>>> Hi JM
>>>>>>>>>>> 
>>>>>>>>>>> Not too sure what you mean, do you mean I should create a new
>>>>>> folder
>>>>>>> in my
>>>>>>>>>>> HBASE_SRC named lib/native/Linux-x86 and copy these files to this
>>>>>>> folder
>>>>>>>>>>> then try to compile it again?
>>>>>>>>>>> 
>>>>>>>>>>> Regards
>>>>>>>>>>> Arthur
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
>>>>>>> jean-marc@spaggiari.org>
>>>>>>>>>>> wrote:
>>>>>>>>>>> 
>>>>>>>>>>>> Hi Arthur,
>>>>>>>>>>>> 
>>>>>>>>>>>> Almost done! You now need to copy them on the HBase folder.
>>>>>>>>>>>> 
>>>>>>>>>>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v
>> .jar
>>>>>> |
>>>>>>> grep
>>>>>>>>>>> -v
>>>>>>>>>>>> .rb
>>>>>>>>>>>> .
>>>>>>>>>>>> ├── native
>>>>>>>>>>>> │   └── Linux-x86
>>>>>>>>>>>> │       ├── libsnappy.a
>>>>>>>>>>>> │       ├── libsnappy.la
>>>>>>>>>>>> │       ├── libsnappy.so
>>>>>>>>>>>> │       ├── libsnappy.so.1
>>>>>>>>>>>> │       └── libsnappy.so.1.2.0
>>>>>>>>>>>> 
>>>>>>>>>>>> I don't have any hadoop-snappy lib in my hbase folder and it
>> works
>>>>>>> very
>>>>>>>>>>>> well with Snappy for me...
>>>>>>>>>>>> 
>>>>>>>>>>>> JM
>>>>>>>>>>>> 
>>>>>>>>>>>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>>>>>>>>> arthur.hk.chan@gmail.com
>>>>>>>>>>>>> :
>>>>>>>>>>>> 
>>>>>>>>>>>>> Hi JM,
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Below are my steps to install snappy lib, do I miss something?
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Regards
>>>>>>>>>>>>> Arthur
>>>>>>>>>>>>> 
>>>>>>>>>>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>>>>>>>>>>>>> tar -vxf snappy-1.1.1.tar.gz
>>>>>>>>>>>>> cd snappy-1.1.1
>>>>>>>>>>>>> ./configure
>>>>>>>>>>>>> make
>>>>>>>>>>>>> make install
>>>>>>>>>>>>>    make[1]: Entering directory
>>>>>>>>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>>>>>>>    test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>>>>>>>>>>>>>     /bin/sh ./libtool   --mode=install /usr/bin/install -c
>>>>>>>>>>>>> libsnappy.la '/usr/local/lib'
>>>>>>>>>>>>>    libtool: install: /usr/bin/install -c
>>>>>>> .libs/libsnappy.so.1.2.0
>>>>>>>>>>>>> /usr/local/lib/libsnappy.so.1.2.0
>>>>>>>>>>>>>    libtool: install: (cd /usr/local/lib && { ln -s -f
>>>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 &&
>> ln
>>>>>>> -s
>>>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
>>>>>>>>>>>>>    libtool: install: (cd /usr/local/lib && { ln -s -f
>>>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln
>> -s
>>>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so; }; })
>>>>>>>>>>>>>    libtool: install: /usr/bin/install -c .libs/libsnappy.lai
>>>>>>>>>>>>> /usr/local/lib/libsnappy.la
>>>>>>>>>>>>>    libtool: install: /usr/bin/install -c .libs/libsnappy.a
>>>>>>>>>>>>> /usr/local/lib/libsnappy.a
>>>>>>>>>>>>>    libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>>>>>>>>>>>>>    libtool: install: ranlib /usr/local/lib/libsnappy.a
>>>>>>>>>>>>>    libtool: finish:
>>>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>> 
>>>>>> 
>>>> 
>> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
>>>>>>>>>>>>> ldconfig -n /usr/local/lib
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>> 
>> ----------------------------------------------------------------------
>>>>>>>>>>>>>    Libraries have been installed in:
>>>>>>>>>>>>>    /usr/local/lib
>>>>>>>>>>>>>    If you ever happen to want to link against installed
>>>>>>> libraries
>>>>>>>>>>>>>    in a given directory, LIBDIR, you must either use libtool,
>>>>>>> and
>>>>>>>>>>>>>    specify the full pathname of the library, or use the
>>>>>>> `-LLIBDIR'
>>>>>>>>>>>>>    flag during linking and do at least one of the following:
>>>>>>>>>>>>>    - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>>>>>>>>>>>>>    during execution
>>>>>>>>>>>>>    - add LIBDIR to the `LD_RUN_PATH' environment variable
>>>>>>>>>>>>>    during linking
>>>>>>>>>>>>>    - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>>>>>>>>>>>>>    - have your system administrator add LIBDIR to
>>>>>>> `/etc/ld.so.conf'
>>>>>>>>>>>>>    See any operating system documentation about shared
>>>>>>> libraries for
>>>>>>>>>>>>>    more information, such as the ld(1) and ld.so(8) manual
>>>>>>> pages.
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>> 
>> ----------------------------------------------------------------------
>>>>>>>>>>>>>    test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
>>>>>>>>>>>>> "/usr/local/share/doc/snappy"
>>>>>>>>>>>>>     /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
>>>>>>> README
>>>>>>>>>>>>> format_description.txt framing_format.txt
>>>>>>> '/usr/local/share/doc/snappy'
>>>>>>>>>>>>>    test -z "/usr/local/include" || /bin/mkdir -p
>>>>>>>>>>> "/usr/local/include"
>>>>>>>>>>>>>     /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
>>>>>>>>>>>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
>>>>>>>>>>>>>    make[1]: Leaving directory
>>>>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>>>>>>> 
>>>>>>>>>>>>> ll /usr/local/lib
>>>>>>>>>>>>>    -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
>>>>>>>>>>>>>    -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
>>>>>>>>>>>>>    lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so
>>>>>> ->
>>>>>>>>>>>>> libsnappy.so.1.2.0
>>>>>>>>>>>>>    lrwxrwxrwx. 1 root root       18 Aug 20 00:14
>>>>>> libsnappy.so.1
>>>>>>> ->
>>>>>>>>>>>>> libsnappy.so.1.2.0
>>>>>>>>>>>>>    -rwxr-xr-x. 1 root root   147726 Aug 20 00:14
>>>>>>> libsnappy.so.1.2.0
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
>>>>>>>>>>> jean-marc@spaggiari.org>
>>>>>>>>>>>>> wrote:
>>>>>>>>>>>>> 
>>>>>>>>>>>>>> Hi Arthur,
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> Do you have snappy libs installed and configured? HBase doesn't come with
>>>>>>>>>>>>>> Snappy. So you need to have it first.
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> Shameless plug:
>>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>> 
>>>>>> 
>>>> 
>> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> This is for 0.96 but should be very similar for 0.98. I will try it soon
>>>>>>>>>>>>>> and post an update, but keep us posted here so we can support you...
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> JM
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>>>>>>>>>>> arthur.hk.chan@gmail.com
>>>>>>>>>>>>>>> :
>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version
>>>>>> is
>>>>>>>>>>> 2.4.1)
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> Can you please advise what would be wrong?  Should my pom.xml
>>>>>> be
>>>>>>>>>>>>> incorrect
>>>>>>>>>>>>>>> and missing something?
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> Regards
>>>>>>>>>>>>>>> Arthur
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> Below are my commands:
>>>>>>>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>>>>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> log:
>>>>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>>>>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [INFO] Reactor Summary:
>>>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>>>> [INFO] HBase ............................................. SUCCESS [3.129s]
>>>>>>>>>>>>>>> [INFO] HBase - Common .................................... SUCCESS [3.105s]
>>>>>>>>>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.976s]
>>>>>>>>>>>>>>> [INFO] HBase - Client .................................... SUCCESS [0.925s]
>>>>>>>>>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.183s]
>>>>>>>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.497s]
>>>>>>>>>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.407s]
>>>>>>>>>>>>>>> [INFO] HBase - Server .................................... FAILURE [0.103s]
>>>>>>>>>>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>>>>>>>>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>>>>>>>>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>>>>>>>>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>>>>>>>>>>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>>>>>>>>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [INFO] Total time: 9.939s
>>>>>>>>>>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>>>>>>>>>>>>>>> [INFO] Final Memory: 61M/2921M
>>>>>>>>>>>>>>> [INFO] ------------------------------------------------------------------------
>>>>>>>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>>>>>>>>>>>> [ERROR]
>>>>>>>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>>>>>>>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>>>>>>>>>>> [ERROR]
>>>>>>>>>>>>>>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>>>>>>>>>>>>>>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>>>>>>>>>>>> [ERROR]
>>>>>>>>>>>>>>> [ERROR] After correcting the problems, you can resume the build with the command
>>>>>>>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> --
>>>>>>>>> Sean
>>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>> 
>>>> 
>>>> 
>> 
>> 


Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
This command will give you the exact name:

java org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"

Can you try to run it?

But it's most probably Linux-amd64-64
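The effect of that sed step can be sketched with a stand-in value. Running `org.apache.hadoop.util.PlatformName` itself needs the Hadoop jars on the classpath (JM shows how further down this thread), so `PLATFORM_RAW` below is hypothetical output, not something this snippet computes:

```shell
# PlatformName prints os.name/os.arch/data-model; the sed replaces any
# spaces (e.g. in "Mac OS X") so the result is usable as a directory name.
# PLATFORM_RAW stands in for the real PlatformName output.
PLATFORM_RAW="Linux amd64 64"
JAVA_PLATFORM=$(printf '%s' "$PLATFORM_RAW" | sed -e "s/ /_/g")
echo "$JAVA_PLATFORM"   # -> Linux_amd64_64
```

On a typical x86_64 Linux box the real output is already the joined form, hence JM's "most probably Linux-amd64-64".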



2014-08-26 20:24 GMT-04:00 Arthur.hk.chan@gmail.com <
arthur.hk.chan@gmail.com>:

> Hi,
>
> Thanks!
>
> A question:
> If I run:
> $  uname -m
> x86_64
>
> Should I use "lib/native/Linux-amd64-64" or "lib/native/x86_64" in
> $HADOOP_HOME and $HBASE_HOME?
>
> Arthur
>
>
> On 27 Aug, 2014, at 8:10 am, Jean-Marc Spaggiari <je...@spaggiari.org>
> wrote:
>
> > Ok.
> >
> > This is the way the lib path is built:
> >
> > JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH"
> > ${HBASE_HOME}/build/native/${JAVA_PLATFORM}/lib)
> >
> > And JAVA_PLATFORM comes from JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA}
> > org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
> >
> > You can double check it doing:
> >
> > # Adjust to you java_home...
> > export JAVA_HOME=/usr/local/jdk1.7.0_45/
> >
> > export CLASSPATH=`bin/hbase classpath`
> > $JAVA_HOME/bin/java org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"
> >
> > Result for me is this: Linux-amd64-64. Might  be different for you.
> >
> > Then you link the libs the way Alex said before:
> > cd lib/native/Linux-amd64-64
> > ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so .
> > ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1 .
> >
> > AND.....
> >
> > The hadoop so too! And I think this is what's missing for you:
> > ln -s /YOURHADOOPPATH/libhadoop.so .
> >
> > Your folder should look like this:
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ tree
> > .
> > └── Linux-amd64-64
> >    ├── libhadoop.so
> >    ├── libsnappy.so -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so
> >    └── libsnappy.so.1 -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1
> >
> > I copied libhadoop.so instead of doing a link because it was not
> available
> > on this computer.
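Pulling JM's recipe together as one runnable sketch — the temp directories below are placeholders standing in for a real HBase install and a real snappy build tree, so adjust the paths before using this for real:

```shell
# Sketch of the lib/native layout JM describes above.
HBASE_DIR=$(mktemp -d)       # placeholder for e.g. ~/hbase-0.98.5-hadoop2
SNAPPY_LIBS=$(mktemp -d)     # placeholder for e.g. ~/snappy-1.0.5/.libs
touch "$SNAPPY_LIBS/libsnappy.so" "$SNAPPY_LIBS/libsnappy.so.1"

# Directory name must match your JAVA_PLATFORM value.
NATIVE_DIR="$HBASE_DIR/lib/native/Linux-amd64-64"
mkdir -p "$NATIVE_DIR"
ln -sf "$SNAPPY_LIBS/libsnappy.so"   "$NATIVE_DIR/"
ln -sf "$SNAPPY_LIBS/libsnappy.so.1" "$NATIVE_DIR/"
# libhadoop.so must land here too -- link it, or copy it as JM did:
# ln -sf /path/to/hadoop/lib/native/libhadoop.so "$NATIVE_DIR/"
ls -l "$NATIVE_DIR"
```

After that, rerunning the CompressionTest command from this thread is the quickest way to confirm the layout works.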
> >
> > Then test it:
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> > 2014-08-26 20:06:43,987 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > 2014-08-26 20:06:44,831 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-26 20:06:44,832 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> > 2014-08-26 20:06:45,125 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
> > 2014-08-26 20:06:45,131 INFO  [main] compress.CodecPool: Got brand-new compressor [.snappy]
> > 2014-08-26 20:06:45,254 INFO  [main] compress.CodecPool: Got brand-new decompressor [.snappy]
> > SUCCESS
> >
> >
> > Please let us know if it still doesn't work for you. Without libhadoop.so
> > it doesn't work for me...
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ rm
> > Linux-amd64-64/libhadoop.so
> >
> > jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> > 2014-08-26 20:09:28,945 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > 2014-08-26 20:09:29,460 WARN  [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> > 2014-08-26 20:09:29,775 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-26 20:09:29,776 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
> >    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native Method)
> > ...
> >
> >
> > I did all of that using a brand new extracted
> > hbase-0.98.5-hadoop2-bin.tar.gz file.
> >
> > JM
> >
> >
> > 2014-08-26 19:47 GMT-04:00 Arthur.hk.chan@gmail.com <
> > arthur.hk.chan@gmail.com>:
> >
> >> $ uname -m
> >> x86_64
> >>
> >> Arthur
> >>
> >> On 27 Aug, 2014, at 7:45 am, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org>
> >> wrote:
> >>
> >>> Hi Arthur,
> >>>
> >>> What uname -m gives you? you need to check that to create the right
> >> folder
> >>> under the lib directory.
> >>>
> >>> JM
> >>>
> >>>
> >>> 2014-08-26 19:43 GMT-04:00 Alex Kamil <al...@gmail.com>:
> >>>
> >>>> Something like this worked for me
> >>>> 1. get hbase binaries
> >>>> 2. sudo yum install snappy snappy-devel
> >>>> 3. ln -sf /usr/lib64/libsnappy.so
> >>>> /var/lib/hadoop/lib/native/Linux-amd64-64/.
> >>>> 4. ln -sf /usr/lib64/libsnappy.so
> >>>> /var/lib/hbase/lib/native/Linux-amd64-64/.
> >>>> 5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
> >>>> ref: https://issues.apache.org/jira/browse/PHOENIX-877
> >>>>
> >>>>
> >>>> On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <
> >>>> arthur.hk.chan@gmail.com> wrote:
> >>>>
> >>>>> Hi,
> >>>>>
> >>>>> I just tried three more steps but was not able to get thru.
> >>>>>
> >>>>>
> >>>>> 1) copied  snappy files to $HBASE_HOME/lib
> >>>>> $ cd $HBASE_HOME
> >>>>> $ ll lib/*sna*
> >>>>> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54 lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> >>>>> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43 lib/snappy-java-1.0.4.1.jar
> >>>>>
> >>>>> ll lib/native/
> >>>>> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
> >>>>>
> >>>>> ll lib/native/Linux-amd64-64/
> >>>>> total 18964
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> >>>>> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so -> libsnappy.so.1.2.0
> >>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 -> libsnappy.so.1.2.0
> >>>>> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> >>>>> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
> >>>>>
> >>>>> 2)  $HBASE_HOME/conf/hbase-env.sh, added
> >>>>>
> >>>>> ###
> >>>>> export
> >>>>>
> >>>>
> >>
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> >>>>> export
> >>>>>
> >>>>
> >>
> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> >>>>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> >>>>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> >>>>> ###
> >>>>>
> >>>>> 3) restart HBASE and tried again
> >>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>>>> file:///tmp/snappy-test snappy
> >>>>> 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation:
> >>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>>>> explanation.
> >>>>> 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum
> using
> >>>>> org.apache.hadoop.util.PureJavaCrc32
> >>>>> 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can
> >> use
> >>>>> org.apache.hadoop.util.PureJavaCrc32C
> >>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
> >>>>> library not available: this version of libhadoop was built without
> >> snappy
> >>>>> support.
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >>>>>       at
> >>>>>
> >>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >>>>>       at
> >>>>>
> >>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >>>>>       at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >>>>>
> >>>>>
> >>>>> Regards
> >>>>> Arthur
> >>>>>
> >>>>>
> >>>>>
> >>>>> On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <
> >>>>> arthur.hk.chan@gmail.com> wrote:
> >>>>>
> >>>>>> Hi Sean,
> >>>>>>
> >>>>>> Thanks for your reply.
> >>>>>>
> >>>>>> I tried the following tests
> >>>>>>
> >>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>>>> file:///tmp/snappy-test gz
> >>>>>> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation:
> >>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>>>> explanation.
> >>>>>> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum
> using
> >>>>> org.apache.hadoop.util.PureJavaCrc32
> >>>>>> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can
> >>>> use
> >>>>> org.apache.hadoop.util.PureJavaCrc32C
> >>>>>> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully
> >>>>> loaded & initialized native-zlib library
> >>>>>> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got
> brand-new
> >>>>> compressor [.gz]
> >>>>>> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got
> brand-new
> >>>>> compressor [.gz]
> >>>>>> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got
> brand-new
> >>>>> decompressor [.gz]
> >>>>>> SUCCESS
> >>>>>>
> >>>>>>
> >>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>>>> file:///tmp/snappy-test snappy
> >>>>>> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation:
> >>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>>>> explanation.
> >>>>>> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum
> using
> >>>>> org.apache.hadoop.util.PureJavaCrc32
> >>>>>> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can
> >>>> use
> >>>>> org.apache.hadoop.util.PureJavaCrc32C
> >>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
> >>>>> library not available: this version of libhadoop was built without
> >> snappy
> >>>>> support.
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >>>>>>     at
> >>>>>
> >>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >>>>>>     at
> >>>>>
> >>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >>>>>>
> >>>>>>
> >>>>>> $ hbase shell
> >>>>>> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation:
> >>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>>>> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> >>>>>> Type "exit<RETURN>" to leave the HBase Shell
> >>>>>> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
> >>>>>>
> >>>>>> hbase(main):001:0>
> >>>>>> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION =>
> >>>>> 'snappy'}
> >>>>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: Found binding in
> >>>>>
> >>>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>>>> explanation.
> >>>>>>
> >>>>>> ERROR: java.io.IOException: Compression algorithm 'snappy'
> previously
> >>>>> failed test.
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
> >>>>>>     at
> >>>>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
> >>>>>>     at
> >>>>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> >>>>>>     at
> >>>> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
> >>>>>>     at
> org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> >>>>>>     at
> >>>>>
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> >>>>>>     at
> >>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> >>>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> >>>>>>     at
> >>>>>
> >>>>
> >>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> >>>>>>     at java.lang.Thread.run(Thread.java:662)
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>>
> >>>>>> Regards
> >>>>>> Arthur
> >>>>>>
> >>>>>>
> >>>>>> On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com>
> >> wrote:
> >>>>>>
> >>>>>>> Hi Arthur!
> >>>>>>>
> >>>>>>> Our Snappy build instructions are currently out of date and I'm
> >>>> working
> >>>>> on updating them[1]. In short, I don't think there are any special
> >> build
> >>>>> steps for using snappy.
> >>>>>>>
> >>>>>>> I'm still working out what needs to be included in our instructions
> >>>> for
> >>>>> local and cluster testing.
> >>>>>>>
> >>>>>>> If you use the test for compression options, locally things will
> fail
> >>>>> because the native hadoop libs won't be present:
> >>>>>>>
> >>>>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>>>> file:///tmp/snappy-test snappy
> >>>>>>> (for comparison, replace "snappy" with "gz" and you will get a
> >> warning
> >>>>> about not having native libraries, but the test will succeed.)
> >>>>>>>
> >>>>>>> I believe JM's suggestion is for you to copy the Hadoop native
> >>>>> libraries into the local HBase lib/native directory, which would
> allow
> >>>> the
> >>>>> local test to pass. If you are running in a deployed Hadoop cluster,
> I
> >>>>> would expect the necessary libraries to already be available to
> HBase.
> >>>>>>>
> >>>>>>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
> >>>>>>>
> >>>>>>> -Sean
> >>>>>>>
> >>>>>>>
> >>>>>>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
> >>>>> arthur.hk.chan@gmail.com> wrote:
> >>>>>>> Hi JM
> >>>>>>>
> >>>>>>> Below are my commands, tried two cases under same source code
> folder:
> >>>>>>> a) compile with snappy parameters(failed),
> >>>>>>> b) compile without snappy parameters (successful).
> >>>>>>>
> >>>>>>> Regards
> >>>>>>> Arthur
> >>>>>>>
> >>>>>>> wget
> >>>>>
> http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
> >>>>>>> tar -vxf hbase-0.98.4-src.tar.gz
> >>>>>>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
> >>>>>>> cd  hbase-0.98.4-src_snappy
> >>>>>>> nano dev-support/generate-hadoopX-poms.sh
> >>>>>>> (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
> >>>>>>>
> >>>>>>>
> >>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4
> 0.98.4-hadoop2
> >>>>>>> a) with snappy parameters
> >>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >>>>> -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [WARNING] The POM for
> >>>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no
> >>>>> dependency information available
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [INFO] Reactor Summary:
> >>>>>>> [INFO]
> >>>>>>> [INFO] HBase ............................................. SUCCESS
> >>>>> [8.192s]
> >>>>>>> [INFO] HBase - Common .................................... SUCCESS
> >>>>> [5.638s]
> >>>>>>> [INFO] HBase - Protocol .................................. SUCCESS
> >>>>> [1.535s]
> >>>>>>> [INFO] HBase - Client .................................... SUCCESS
> >>>>> [1.206s]
> >>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> >>>>> [0.193s]
> >>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> >>>>> [0.798s]
> >>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
> >>>>> [0.438s]
> >>>>>>> [INFO] HBase - Server .................................... FAILURE
> >>>>> [0.234s]
> >>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>>>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [INFO] BUILD FAILURE
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [INFO] Total time: 19.474s
> >>>>>>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
> >>>>>>> [INFO] Final Memory: 51M/1100M
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
> >>>>> resolve dependencies for project
> >>>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find
> >>>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >>>>> http://maven.oschina.net/content/groups/public/ was cached in the
> >> local
> >>>>> repository, resolution will not be reattempted until the update
> >> interval
> >>>> of
> >>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>>>> [ERROR]
> >>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven
> with
> >>>>> the -e switch.
> >>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
> >> logging.
> >>>>>>> [ERROR]
> >>>>>>> [ERROR] For more information about the errors and possible
> solutions,
> >>>>> please read the following articles:
> >>>>>>> [ERROR] [Help 1]
> >>>>>
> >>>>
> >>
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>>>> [ERROR]
> >>>>>>> [ERROR] After correcting the problems, you can resume the build
> with
> >>>>> the command
> >>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>> b) try again, without snappy parameters
> >>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> -Prelease
> >>>>>>> [INFO] Building tar:
> >>>>>
> >>>>
> >>
> /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [INFO] Reactor Summary:
> >>>>>>> [INFO]
> >>>>>>> [INFO] HBase ............................................. SUCCESS
> >>>>> [3.290s]
> >>>>>>> [INFO] HBase - Common .................................... SUCCESS
> >>>>> [3.119s]
> >>>>>>> [INFO] HBase - Protocol .................................. SUCCESS
> >>>>> [0.972s]
> >>>>>>> [INFO] HBase - Client .................................... SUCCESS
> >>>>> [0.920s]
> >>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> >>>>> [0.167s]
> >>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> >>>>> [0.504s]
> >>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
> >>>>> [0.382s]
> >>>>>>> [INFO] HBase - Server .................................... SUCCESS
> >>>>> [4.790s]
> >>>>>>> [INFO] HBase - Testing Util .............................. SUCCESS
> >>>>> [0.598s]
> >>>>>>> [INFO] HBase - Thrift .................................... SUCCESS
> >>>>> [1.536s]
> >>>>>>> [INFO] HBase - Shell ..................................... SUCCESS
> >>>>> [0.369s]
> >>>>>>> [INFO] HBase - Integration Tests ......................... SUCCESS
> >>>>> [0.443s]
> >>>>>>> [INFO] HBase - Examples .................................. SUCCESS
> >>>>> [0.459s]
> >>>>>>> [INFO] HBase - Assembly .................................. SUCCESS
> >>>>> [13.240s]
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [INFO] BUILD SUCCESS
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>> [INFO] Total time: 31.408s
> >>>>>>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
> >>>>>>> [INFO] Final Memory: 57M/1627M
> >>>>>>> [INFO]
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
> >>>>> jean-marc@spaggiari.org> wrote:
> >>>>>>>
> >>>>>>>> Hi Arthur,
> >>>>>>>>
> >>>>>>>> How have you extracted HBase source and what command do you run to
> >>>>> build? I
> >>>>>>>> will do the same here locally so I can provide you the exact step
> to
> >>>>>>>> complete.
> >>>>>>>>
> >>>>>>>> JM
> >>>>>>>>
> >>>>>>>>
> >>>>>>>> 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>>>> arthur.hk.chan@gmail.com
> >>>>>>>>> :
> >>>>>>>>
> >>>>>>>>> Hi JM
> >>>>>>>>>
> >>>>>>>>> Not too sure what you mean, do you mean I should create a new
> >>>> folder
> >>>>> in my
> >>>>>>>>> HBASE_SRC named lib/native/Linux-x86 and copy these files to this
> >>>>> folder
> >>>>>>>>> then try to compile it again?
> >>>>>>>>>
> >>>>>>>>> Regards
> >>>>>>>>> Arthur
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
> >>>>> jean-marc@spaggiari.org>
> >>>>>>>>> wrote:
> >>>>>>>>>
> >>>>>>>>>> Hi Arthur,
> >>>>>>>>>>
> >>>>>>>>>> Almost done! You now need to copy them into the HBase folder.
> >>>>>>>>>>
> >>>>>>>>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v
> .jar
> >>>> |
> >>>>> grep
> >>>>>>>>> -v
> >>>>>>>>>> .rb
> >>>>>>>>>> .
> >>>>>>>>>> ├── native
> >>>>>>>>>> │   └── Linux-x86
> >>>>>>>>>> │       ├── libsnappy.a
> >>>>>>>>>> │       ├── libsnappy.la
> >>>>>>>>>> │       ├── libsnappy.so
> >>>>>>>>>> │       ├── libsnappy.so.1
> >>>>>>>>>> │       └── libsnappy.so.1.2.0
> >>>>>>>>>>
> >>>>>>>>>> I don't have any hadoop-snappy lib in my hbase folder and it
> works
> >>>>> very
> >>>>>>>>>> well with Snappy for me...
> >>>>>>>>>>
> >>>>>>>>>> JM
> >>>>>>>>>>
> >>>>>>>>>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>>>>>>>> arthur.hk.chan@gmail.com
> >>>>>>>>>>> :
> >>>>>>>>>>
> >>>>>>>>>>> Hi JM,
> >>>>>>>>>>>
> >>>>>>>>>>> Below are my steps to install snappy lib, do I miss something?
> >>>>>>>>>>>
> >>>>>>>>>>> Regards
> >>>>>>>>>>> Arthur
> >>>>>>>>>>>
> >>>>>>>>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> >>>>>>>>>>> tar -vxf snappy-1.1.1.tar.gz
> >>>>>>>>>>> cd snappy-1.1.1
> >>>>>>>>>>> ./configure
> >>>>>>>>>>> make
> >>>>>>>>>>> make install
> >>>>>>>>>>>     make[1]: Entering directory
> >>>>>>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>>>>>>>>     test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> >>>>>>>>>>>      /bin/sh ./libtool   --mode=install /usr/bin/install -c
> >>>>>>>>>>> libsnappy.la '/usr/local/lib'
> >>>>>>>>>>>     libtool: install: /usr/bin/install -c
> >>>>> .libs/libsnappy.so.1.2.0
> >>>>>>>>>>> /usr/local/lib/libsnappy.so.1.2.0
> >>>>>>>>>>>     libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 &&
> ln
> >>>>> -s
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> >>>>>>>>>>>     libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln
> -s
> >>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so; }; })
> >>>>>>>>>>>     libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> >>>>>>>>>>> /usr/local/lib/libsnappy.la
> >>>>>>>>>>>     libtool: install: /usr/bin/install -c .libs/libsnappy.a
> >>>>>>>>>>> /usr/local/lib/libsnappy.a
> >>>>>>>>>>>     libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> >>>>>>>>>>>     libtool: install: ranlib /usr/local/lib/libsnappy.a
> >>>>>>>>>>>     libtool: finish:
> >>>>>>>>>>>
> >>>>>>>>>
> >>>>>
> >>>>
> >>
> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> >>>>>>>>>>> ldconfig -n /usr/local/lib
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>
> ----------------------------------------------------------------------
> >>>>>>>>>>>     Libraries have been installed in:
> >>>>>>>>>>>     /usr/local/lib
> >>>>>>>>>>>     If you ever happen to want to link against installed
> >>>>> libraries
> >>>>>>>>>>>     in a given directory, LIBDIR, you must either use libtool,
> >>>>> and
> >>>>>>>>>>>     specify the full pathname of the library, or use the
> >>>>> `-LLIBDIR'
> >>>>>>>>>>>     flag during linking and do at least one of the following:
> >>>>>>>>>>>     - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> >>>>>>>>>>>     during execution
> >>>>>>>>>>>     - add LIBDIR to the `LD_RUN_PATH' environment variable
> >>>>>>>>>>>     during linking
> >>>>>>>>>>>     - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> >>>>>>>>>>>     - have your system administrator add LIBDIR to
> >>>>> `/etc/ld.so.conf'
> >>>>>>>>>>>     See any operating system documentation about shared
> >>>>> libraries for
> >>>>>>>>>>>     more information, such as the ld(1) and ld.so(8) manual
> >>>>> pages.
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>
> ----------------------------------------------------------------------
> >>>>>>>>>>>     test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> >>>>>>>>>>> "/usr/local/share/doc/snappy"
> >>>>>>>>>>>      /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
> >>>>> README
> >>>>>>>>>>> format_description.txt framing_format.txt
> >>>>> '/usr/local/share/doc/snappy'
> >>>>>>>>>>>     test -z "/usr/local/include" || /bin/mkdir -p
> >>>>>>>>> "/usr/local/include"
> >>>>>>>>>>>      /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> >>>>>>>>>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> >>>>>>>>>>>     make[1]: Leaving directory
> >>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>>>>>>>>
> >>>>>>>>>>> ll /usr/local/lib
> >>>>>>>>>>>     -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> >>>>>>>>>>>     -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> >>>>>>>>>>>     lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so
> >>>> ->
> >>>>>>>>>>> libsnappy.so.1.2.0
> >>>>>>>>>>>     lrwxrwxrwx. 1 root root       18 Aug 20 00:14
> >>>> libsnappy.so.1
> >>>>> ->
> >>>>>>>>>>> libsnappy.so.1.2.0
> >>>>>>>>>>>     -rwxr-xr-x. 1 root root   147726 Aug 20 00:14
> >>>>> libsnappy.so.1.2.0
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>>>>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
> >>>>>>>>> jean-marc@spaggiari.org>
> >>>>>>>>>>> wrote:
> >>>>>>>>>>>
> >>>>>>>>>>>> Hi Arthur,
> >>>>>>>>>>>>
> >>>>>>>>>>>> Do you have snappy libs installed and configured? HBase
> doesn't
> >>>>> come
> >>>>>>>>> with
> >>>>>>>>>>>> Snappy. So you need to have it first.
> >>>>>>>>>>>>
> >>>>>>>>>>>> Shameless plug:
> >>>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>>>>>
> >>>>>
> >>>>
> >>
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >>>>>>>>>>>>
> >>>>>>>>>>>> This is for 0.96 but should be very similar for 0.98. I will
> try
> >>>>> it
> >>>>>>>>> soon
> >>>>>>>>>>>> and post an update, but keep us posted here so we can support
> >>>>> you...
> >>>>>>>>>>>>
> >>>>>>>>>>>> JM
> >>>>>>>>>>>>
> >>>>>>>>>>>>
> >>>>>>>>>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>>>>>>>>>> arthur.hk.chan@gmail.com
> >>>>>>>>>>>>> :
> >>>>>>>>>>>>
> >>>>>>>>>>>>> Hi,
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version
> >>>> is
> >>>>>>>>> 2.4.1)
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Can you please advise what would be wrong?  Should my pom.xml
> >>>> be
> >>>>>>>>>>> incorrect
> >>>>>>>>>>>>> and missing something?
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Regards
> >>>>>>>>>>>>> Arthur
> >>>>>>>>>>>>>
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Below are my commands:
> >>>>>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4
> >>>>> 0.98.4-hadoop2
> >>>>>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >>>>>>>>>>>>> -Prelease,hadoop-snappy
> >>>>>>>>>>>>>
> >>>>>>>>>>>>> Log:
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>>
> >>>>>>>>>
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>>
> >>>>>>>>>
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>>>>>>>> [WARNING] The POM for
> >>>>>>>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> >>>>>>>>>>>>> is missing, no dependency information available
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>>
> >>>>>>>>>
> >>>>>
> >> ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] Reactor Summary:
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>> [INFO] HBase .............................................
> >>>>> SUCCESS
> >>>>>>>>>>> [3.129s]
> >>>>>>>>>>>>> [INFO] HBase - Common ....................................
> >>>>> SUCCESS
> >>>>>>>>>>> [3.105s]
> >>>>>>>>>>>>> [INFO] HBase - Protocol ..................................
> >>>>> SUCCESS
> >>>>>>>>>>> [0.976s]
> >>>>>>>>>>>>> [INFO] HBase - Client ....................................
> >>>>> SUCCESS
> >>>>>>>>>>> [0.925s]
> >>>>>>>>>>>>> [INFO] HBase - Hadoop Compatibility ......................
> >>>>> SUCCESS
> >>>>>>>>>>> [0.183s]
> >>>>>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility ..................
> >>>>> SUCCESS
> >>>>>>>>>>> [0.497s]
> >>>>>>>>>>>>> [INFO] HBase - Prefix Tree ...............................
> >>>>> SUCCESS
> >>>>>>>>>>> [0.407s]
> >>>>>>>>>>>>> [INFO] HBase - Server .................................... FAILURE [0.103s]
> >>>>>>>>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>>>>>>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>> ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] BUILD FAILURE
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>> ------------------------------------------------------------------------
> >>>>>>>>>>>>> [INFO] Total time: 9.939s
> >>>>>>>>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >>>>>>>>>>>>> [INFO] Final Memory: 61M/2921M
> >>>>>>>>>>>>> [INFO]
> >>>>>>>>>>>>> ------------------------------------------------------------------------
> >>>>>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
> >>>>>>>>>>>>> resolve dependencies for project
> >>>>>>>>>>>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> >>>>>>>>>>>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >>>>>>>>>>>>> http://maven.oschina.net/content/groups/public/ was cached in the local
> >>>>>>>>>>>>> repository, resolution will not be reattempted until the update
> >>>>>>>>>>>>> interval of nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>>>>>>>>>> [ERROR]
> >>>>>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
> >>>>>>>>>>>>> the -e switch.
> >>>>>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >>>>>>>>>>>>> [ERROR]
> >>>>>>>>>>>>> [ERROR] For more information about the errors and possible solutions,
> >>>>>>>>>>>>> please read the following articles:
> >>>>>>>>>>>>> [ERROR] [Help 1]
> >>>>>>>>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>>>>>>>>>> [ERROR]
> >>>>>>>>>>>>> [ERROR] After correcting the problems, you can resume the build with
> >>>>>>>>>>>>> the command
> >>>>>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>>>
> >>>>>>> --
> >>>>>>> Sean
> 

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi,

Thanks!  

A question:
If I run:
$  uname -m   
x86_64

Should I use "lib/native/Linux-amd64-64" or "lib/native/x86_64" in $HADOOP_HOME and $HBASE_HOME?

Arthur
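
[Editor's note: the directory name comes from Hadoop's PlatformName class, not from uname. A minimal sketch of the mapping the stock bin/hbase script applies — the literal platform string below is an assumption for a 64-bit Linux JVM:]

```shell
# PlatformName prints os.name-os.arch-datamodel; on a 64-bit Linux JVM
# that is typically "Linux-amd64-64" even though uname -m says x86_64.
platform="Linux-amd64-64"
# The script replaces any spaces (e.g. an os.name like "Mac OS X")
# with underscores before building the path:
platform=$(echo "$platform" | sed -e "s/ /_/g")
echo "lib/native/$platform"
```

With a real classpath set, `java org.apache.hadoop.util.PlatformName` prints the platform string directly, as the quoted reply below shows.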


On 27 Aug, 2014, at 8:10 am, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> Ok.
> 
> This is the way the lib path is built:
> 
> JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH"
> ${HBASE_HOME}/build/native/${JAVA_PLATFORM}/lib)
> 
> And JAVA_PLATFORM comes from JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA}
> org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`
> 
> You can double check it doing:
> 
> # Adjust to you java_home...
> export JAVA_HOME=/usr/local/jdk1.7.0_45/
> 
> export CLASSPATH=`bin/hbase classpath`
> $JAVA_HOME/bin/java org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"
> 
> Result for me is: Linux-amd64-64. It might be different for you.
> 
> Then you link the libs the way Alex said before:
> cd lib/native/Linux-amd64-64
> ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so .
> ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1 .
> 
> AND.....
> 
> The hadoop so too! And I think this is what's missing for you:
> ln -s /YOURHADOOPPATH/libhadoop.so .
> 
> Your folder should look like this:
> jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ tree
> .
> └── Linux-amd64-64
>    ├── libhadoop.so
>    ├── libsnappy.so -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so
>    └── libsnappy.so.1 -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1
> 
> I copied libhadoop.so instead of doing a link because it was not available
> on this computer.
> 
> Then test it:
> jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase
> org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> 2014-08-26 20:06:43,987 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> 2014-08-26 20:06:44,831 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> 2014-08-26 20:06:44,832 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-26 20:06:45,125 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.snappy]
> 2014-08-26 20:06:45,131 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.snappy]
> 2014-08-26 20:06:45,254 INFO  [main] compress.CodecPool: Got brand-new
> decompressor [.snappy]
> SUCCESS
> 
> 
> Please let us know if it still doesn't work for you. Without libhadoop.so
> it doesn't work for me...
> jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ rm
> Linux-amd64-64/libhadoop.so
> 
> jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase
> org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> 2014-08-26 20:09:28,945 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> 2014-08-26 20:09:29,460 WARN  [main] util.NativeCodeLoader: Unable to load
> native-hadoop library for your platform... using builtin-java classes where
> applicable
> 2014-08-26 20:09:29,775 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> 2014-08-26 20:09:29,776 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> Exception in thread "main" java.lang.UnsatisfiedLinkError:
> org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
>    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
> Method)
> ...
> 
> 
> I did all of that using a brand new extracted
> hbase-0.98.5-hadoop2-bin.tar.gz file.
> 
> JM
> 
> 
> 2014-08-26 19:47 GMT-04:00 Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com>:
> 
>> $ uname -m
>> x86_64
>> 
>> Arthur
>> 
>> On 27 Aug, 2014, at 7:45 am, Jean-Marc Spaggiari <je...@spaggiari.org>
>> wrote:
>> 
>>> Hi Arthur,
>>> 
>>> What does uname -m give you? You need to check that to create the right
>>> folder under the lib directory.
>>> 
>>> JM
>>> 
>>> 
>>> 2014-08-26 19:43 GMT-04:00 Alex Kamil <al...@gmail.com>:
>>> 
>>>> Something like this worked for me
>>>> 1. get hbase binaries
>>>> 2. sudo yum install snappy snappy-devel
>>>> 3. ln -sf /usr/lib64/libsnappy.so
>>>> /var/lib/hadoop/lib/native/Linux-amd64-64/.
>>>> 4. ln -sf /usr/lib64/libsnappy.so
>>>> /var/lib/hbase/lib/native/Linux-amd64-64/.
>>>> 5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
>>>> ref: https://issues.apache.org/jira/browse/PHOENIX-877
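
[Editor's note: the linking steps above can be sketched end to end. This uses scratch paths so it runs anywhere; /tmp stands in for the real /usr/lib64/libsnappy.so and $HBASE_HOME/lib/native locations named in the thread:]

```shell
# Hedged sketch of steps 3-4: link the system libsnappy into the
# platform directory HBase searches. Scratch paths are stand-ins.
NATIVE_DIR=/tmp/snappy-demo/lib/native/Linux-amd64-64
mkdir -p "$NATIVE_DIR"
touch /tmp/snappy-demo/libsnappy.so   # stand-in for the yum-installed lib
ln -sf /tmp/snappy-demo/libsnappy.so "$NATIVE_DIR"/.
ls -l "$NATIVE_DIR"                   # shows the libsnappy.so symlink
```

On a real node, repeat the `ln -sf` for both the Hadoop and HBase native directories, then verify with the CompressionTest smoke test.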
>>>> 
>>>> 
>>>> On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <
>>>> arthur.hk.chan@gmail.com> wrote:
>>>> 
>>>>> Hi,
>>>>> 
>>>>> I just tried three more steps but was not able to get thru.
>>>>> 
>>>>> 
>>>>> 1) copied  snappy files to $HBASE_HOME/lib
>>>>> $ cd $HBASE_HOME
>>>>> $ ll lib/*sna*
>>>>> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54
>>>>> lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>>>>> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43
>>>> lib/snappy-java-1.0.4.1.jar
>>>>> 
>>>>> ll lib/native/
>>>>> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
>>>>> 
>>>>> ll lib/native/Linux-amd64-64/
>>>>> total 18964
>>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so ->
>>>>> libhadoopsnappy.so.0.0.1
>>>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0
>> ->
>>>>> libhadoopsnappy.so.0.0.1
>>>>> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08
>> libhadoopsnappy.so.0.0.1
>>>>> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so ->
>>>>> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
>>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so ->
>>>>> libprotobuf-lite.so.8.0.0
>>>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8
>> ->
>>>>> libprotobuf-lite.so.8.0.0
>>>>> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08
>>>> libprotobuf-lite.so.8.0.0
>>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so ->
>>>>> libprotobuf.so.8.0.0
>>>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
>>>>> libprotobuf.so.8.0.0
>>>>> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so ->
>>>>> libprotoc.so.8.0.0
>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
>>>>> libprotoc.so.8.0.0
>>>>> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so ->
>>>>> libsnappy.so.1.2.0
>>>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 ->
>>>>> libsnappy.so.1.2.0
>>>>> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
>>>>> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
>>>>> 
>>>>> 2)  $HBASE_HOME/conf/hbase-env.sh, added
>>>>> 
>>>>> ###
>>>>> export
>>>>> 
>>>> 
>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
>>>>> export
>>>>> 
>>>> 
>> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>>>>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
>>>>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
>>>>> ###
>>>>> 
>>>>> 3) restart HBASE and tried again
>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>>>> file:///tmp/snappy-test snappy
>>>>> 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation:
>>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>> 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using
>>>>> org.apache.hadoop.util.PureJavaCrc32
>>>>> 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can
>> use
>>>>> org.apache.hadoop.util.PureJavaCrc32C
>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>>>>> library not available: this version of libhadoop was built without
>> snappy
>>>>> support.
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>>>       at
>>>>> 
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>>>       at
>>>>> 
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>>>       at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>>>> 
>>>>> 
>>>>> Regards
>>>>> Arthur
>>>>> 
>>>>> 
>>>>> 
>>>>> On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <
>>>>> arthur.hk.chan@gmail.com> wrote:
>>>>> 
>>>>>> Hi Sean,
>>>>>> 
>>>>>> Thanks for your reply.
>>>>>> 
>>>>>> I tried the following tests
>>>>>> 
>>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>>>> file:///tmp/snappy-test gz
>>>>>> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation:
>>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>>> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using
>>>>> org.apache.hadoop.util.PureJavaCrc32
>>>>>> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can
>>>> use
>>>>> org.apache.hadoop.util.PureJavaCrc32C
>>>>>> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully
>>>>> loaded & initialized native-zlib library
>>>>>> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new
>>>>> compressor [.gz]
>>>>>> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new
>>>>> compressor [.gz]
>>>>>> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new
>>>>> decompressor [.gz]
>>>>>> SUCCESS
>>>>>> 
>>>>>> 
>>>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>>>> file:///tmp/snappy-test snappy
>>>>>> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation:
>>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>>> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using
>>>>> org.apache.hadoop.util.PureJavaCrc32
>>>>>> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can
>>>> use
>>>>> org.apache.hadoop.util.PureJavaCrc32C
>>>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>>>>> library not available: this version of libhadoop was built without
>> snappy
>>>>> support.
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>>>>     at
>>>>> 
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>>>>     at
>>>>> 
>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>>>>> 
>>>>>> 
>>>>>> $ hbase shell
>>>>>> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation:
>>>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>>>> HBase Shell; enter 'help<RETURN>' for list of supported commands.
>>>>>> Type "exit<RETURN>" to leave the HBase Shell
>>>>>> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
>>>>>> 
>>>>>> hbase(main):001:0>
>>>>>> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION =>
>>>>> 'snappy'}
>>>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: Found binding in
>>>>> 
>>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>>>> explanation.
>>>>>> 
>>>>>> ERROR: java.io.IOException: Compression algorithm 'snappy' previously
>>>>> failed test.
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
>>>>>>     at
>>>>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
>>>>>>     at
>>>>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
>>>>>>     at
>>>> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
>>>>>>     at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
>>>>>>     at
>>>>> 
>>>> 
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>>>     at
>>>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>>>>     at
>>>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>>>     at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>>>     at
>>>>> 
>>>> 
>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>>>     at
>>>>> 
>>>> 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>>>     at java.lang.Thread.run(Thread.java:662)
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> Regards
>>>>>> Arthur
>>>>>> 
>>>>>> 
>>>>>> On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com>
>> wrote:
>>>>>> 
>>>>>>> Hi Arthur!
>>>>>>> 
>>>>>>> Our Snappy build instructions are currently out of date and I'm
>>>> working
>>>>> on updating them[1]. In short, I don't think there are any special
>> build
>>>>> steps for using snappy.
>>>>>>> 
>>>>>>> I'm still working out what needs to be included in our instructions
>>>> for
>>>>> local and cluster testing.
>>>>>>> 
>>>>>>> If you use the test for compression options, locally things will fail
>>>>> because the native hadoop libs won't be present:
>>>>>>> 
>>>>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>>>> file:///tmp/snappy-test snappy
>>>>>>> (for comparison, replace "snappy" with "gz" and you will get a
>> warning
>>>>> about not having native libraries, but the test will succeed.)
>>>>>>> 
>>>>>>> I believe JM's suggestion is for you to copy the Hadoop native
>>>>> libraries into the local HBase lib/native directory, which would allow
>>>> the
>>>>> local test to pass. If you are running in a deployed Hadoop cluster, I
>>>>> would expect the necessary libraries to already be available to HBase.
>>>>>>> 
>>>>>>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
>>>>>>> 
>>>>>>> -Sean
>>>>>>> 
>>>>>>> 
>>>>>>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
>>>>> arthur.hk.chan@gmail.com> wrote:
>>>>>>> Hi JM
>>>>>>> 
>>>>>>> Below are my commands, tried two cases under same source code folder:
>>>>>>> a) compile with snappy parameters(failed),
>>>>>>> b) compile without snappy parameters (successful).
>>>>>>> 
>>>>>>> Regards
>>>>>>> Arthur
>>>>>>> 
>>>>>>> wget
>>>>> http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
>>>>>>> tar -vxf hbase-0.98.4-src.tar.gz
>>>>>>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
>>>>>>> cd  hbase-0.98.4-src_snappy
>>>>>>> nano dev-support/generate-hadoopX-poms.sh
>>>>>>> (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
>>>>>>> 
>>>>>>> 
>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>>>> a) with snappy parameters
>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>>>>> -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [WARNING] The POM for
>>>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no
>>>>> dependency information available
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [INFO] Reactor Summary:
>>>>>>> [INFO]
>>>>>>> [INFO] HBase ............................................. SUCCESS
>>>>> [8.192s]
>>>>>>> [INFO] HBase - Common .................................... SUCCESS
>>>>> [5.638s]
>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS
>>>>> [1.535s]
>>>>>>> [INFO] HBase - Client .................................... SUCCESS
>>>>> [1.206s]
>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
>>>>> [0.193s]
>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
>>>>> [0.798s]
>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
>>>>> [0.438s]
>>>>>>> [INFO] HBase - Server .................................... FAILURE
>>>>> [0.234s]
>>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [INFO] BUILD FAILURE
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [INFO] Total time: 19.474s
>>>>>>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
>>>>>>> [INFO] Final Memory: 51M/1100M
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
>>>>> resolve dependencies for project
>>>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find
>>>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>>>>> http://maven.oschina.net/content/groups/public/ was cached in the
>> local
>>>>> repository, resolution will not be reattempted until the update
>> interval
>>>> of
>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>>>> [ERROR]
>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>>>>> the -e switch.
>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
>> logging.
>>>>>>> [ERROR]
>>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>> please read the following articles:
>>>>>>> [ERROR] [Help 1]
>>>>> 
>>>> 
>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>>>> [ERROR]
>>>>>>> [ERROR] After correcting the problems, you can resume the build with
>>>>> the command
>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> b) try again, without snappy parameters
>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
>>>>>>> [INFO] Building tar:
>>>>> 
>>>> 
>> /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [INFO] Reactor Summary:
>>>>>>> [INFO]
>>>>>>> [INFO] HBase ............................................. SUCCESS
>>>>> [3.290s]
>>>>>>> [INFO] HBase - Common .................................... SUCCESS
>>>>> [3.119s]
>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS
>>>>> [0.972s]
>>>>>>> [INFO] HBase - Client .................................... SUCCESS
>>>>> [0.920s]
>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
>>>>> [0.167s]
>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
>>>>> [0.504s]
>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
>>>>> [0.382s]
>>>>>>> [INFO] HBase - Server .................................... SUCCESS
>>>>> [4.790s]
>>>>>>> [INFO] HBase - Testing Util .............................. SUCCESS
>>>>> [0.598s]
>>>>>>> [INFO] HBase - Thrift .................................... SUCCESS
>>>>> [1.536s]
>>>>>>> [INFO] HBase - Shell ..................................... SUCCESS
>>>>> [0.369s]
>>>>>>> [INFO] HBase - Integration Tests ......................... SUCCESS
>>>>> [0.443s]
>>>>>>> [INFO] HBase - Examples .................................. SUCCESS
>>>>> [0.459s]
>>>>>>> [INFO] HBase - Assembly .................................. SUCCESS
>>>>> [13.240s]
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [INFO] BUILD SUCCESS
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> [INFO] Total time: 31.408s
>>>>>>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
>>>>>>> [INFO] Final Memory: 57M/1627M
>>>>>>> [INFO]
>>>>> 
>> ------------------------------------------------------------------------
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>>>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
>>>>> jean-marc@spaggiari.org> wrote:
>>>>>>> 
>>>>>>>> Hi Arthur,
>>>>>>>> 
>>>>>>>> How have you extracted the HBase source and what command do you run to
>>>>>>>> build? I will do the same here locally so I can provide you the exact
>>>>>>>> steps to complete.
>>>>>>>> 
>>>>>>>> JM
>>>>>>>> 
>>>>>>>> 
>>>>>>>> 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>>> arthur.hk.chan@gmail.com
>>>>>>>>> :
>>>>>>>> 
>>>>>>>>> Hi JM
>>>>>>>>> 
>>>>>>>>> I'm not too sure what you mean. Do you mean I should create a new
>>>>>>>>> folder in my HBASE_SRC named lib/native/Linux-x86, copy these files
>>>>>>>>> to this folder, and then try to compile again?
>>>>>>>>> 
>>>>>>>>> Regards
>>>>>>>>> Arthur
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
>>>>> jean-marc@spaggiari.org>
>>>>>>>>> wrote:
>>>>>>>>> 
>>>>>>>>>> Hi Arthur,
>>>>>>>>>> 
>>>>>>>>>> Almost done! You now need to copy them on the HBase folder.
>>>>>>>>>> 
>>>>>>>>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar
>>>> |
>>>>> grep
>>>>>>>>> -v
>>>>>>>>>> .rb
>>>>>>>>>> .
>>>>>>>>>> ├── native
>>>>>>>>>> │   └── Linux-x86
>>>>>>>>>> │       ├── libsnappy.a
>>>>>>>>>> │       ├── libsnappy.la
>>>>>>>>>> │       ├── libsnappy.so
>>>>>>>>>> │       ├── libsnappy.so.1
>>>>>>>>>> │       └── libsnappy.so.1.2.0
>>>>>>>>>> 
>>>>>>>>>> I don't have any hadoop-snappy lib in my hbase folder and it works
>>>>> very
>>>>>>>>>> well with Snappy for me...
>>>>>>>>>> 
>>>>>>>>>> JM
>>>>>>>>>> 
>>>>>>>>>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>>>>>>> arthur.hk.chan@gmail.com
>>>>>>>>>>> :
>>>>>>>>>> 
>>>>>>>>>>> Hi JM,
>>>>>>>>>>> 
>>>>>>>>>>> Below are my steps to install snappy lib, do I miss something?
>>>>>>>>>>> 
>>>>>>>>>>> Regards
>>>>>>>>>>> Arthur
>>>>>>>>>>> 
>>>>>>>>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>>>>>>>>>>> tar -vxf snappy-1.1.1.tar.gz
>>>>>>>>>>> cd snappy-1.1.1
>>>>>>>>>>> ./configure
>>>>>>>>>>> make
>>>>>>>>>>> make install
>>>>>>>>>>>     make[1]: Entering directory
>>>>>>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>>>>>     test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>>>>>>>>>>>      /bin/sh ./libtool   --mode=install /usr/bin/install -c
>>>>>>>>>>> libsnappy.la '/usr/local/lib'
>>>>>>>>>>>     libtool: install: /usr/bin/install -c
>>>>> .libs/libsnappy.so.1.2.0
>>>>>>>>>>> /usr/local/lib/libsnappy.so.1.2.0
>>>>>>>>>>>     libtool: install: (cd /usr/local/lib && { ln -s -f
>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln
>>>>> -s
>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
>>>>>>>>>>>     libtool: install: (cd /usr/local/lib && { ln -s -f
>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
>>>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so; }; })
>>>>>>>>>>>     libtool: install: /usr/bin/install -c .libs/libsnappy.lai
>>>>>>>>>>> /usr/local/lib/libsnappy.la
>>>>>>>>>>>     libtool: install: /usr/bin/install -c .libs/libsnappy.a
>>>>>>>>>>> /usr/local/lib/libsnappy.a
>>>>>>>>>>>     libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>>>>>>>>>>>     libtool: install: ranlib /usr/local/lib/libsnappy.a
>>>>>>>>>>>     libtool: finish:
>>>>>>>>>>> 
>>>>>>>>> 
>>>>> 
>>>> 
>> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
>>>>>>>>>>> ldconfig -n /usr/local/lib
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>> ----------------------------------------------------------------------
>>>>>>>>>>>     Libraries have been installed in:
>>>>>>>>>>>     /usr/local/lib
>>>>>>>>>>>     If you ever happen to want to link against installed
>>>>> libraries
>>>>>>>>>>>     in a given directory, LIBDIR, you must either use libtool,
>>>>> and
>>>>>>>>>>>     specify the full pathname of the library, or use the
>>>>> `-LLIBDIR'
>>>>>>>>>>>     flag during linking and do at least one of the following:
>>>>>>>>>>>     - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>>>>>>>>>>>     during execution
>>>>>>>>>>>     - add LIBDIR to the `LD_RUN_PATH' environment variable
>>>>>>>>>>>     during linking
>>>>>>>>>>>     - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>>>>>>>>>>>     - have your system administrator add LIBDIR to
>>>>> `/etc/ld.so.conf'
>>>>>>>>>>>     See any operating system documentation about shared
>>>>> libraries for
>>>>>>>>>>>     more information, such as the ld(1) and ld.so(8) manual
>>>>> pages.
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>> ----------------------------------------------------------------------
>>>>>>>>>>>     test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
>>>>>>>>>>> "/usr/local/share/doc/snappy"
>>>>>>>>>>>      /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
>>>>> README
>>>>>>>>>>> format_description.txt framing_format.txt
>>>>> '/usr/local/share/doc/snappy'
>>>>>>>>>>>     test -z "/usr/local/include" || /bin/mkdir -p
>>>>>>>>> "/usr/local/include"
>>>>>>>>>>>      /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
>>>>>>>>>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
>>>>>>>>>>>     make[1]: Leaving directory
>>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>>>>> 
>>>>>>>>>>> ll /usr/local/lib
>>>>>>>>>>>     -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
>>>>>>>>>>>     -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
>>>>>>>>>>>     lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so
>>>> ->
>>>>>>>>>>> libsnappy.so.1.2.0
>>>>>>>>>>>     lrwxrwxrwx. 1 root root       18 Aug 20 00:14
>>>> libsnappy.so.1
>>>>> ->
>>>>>>>>>>> libsnappy.so.1.2.0
>>>>>>>>>>>     -rwxr-xr-x. 1 root root   147726 Aug 20 00:14
>>>>> libsnappy.so.1.2.0
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
>>>>>>>>> jean-marc@spaggiari.org>
>>>>>>>>>>> wrote:
>>>>>>>>>>> 
>>>>>>>>>>>> Hi Arthur,
>>>>>>>>>>>> 
>>>>>>>>>>>> Do you have snappy libs installed and configured? HBase doesn't come
>>>>>>>>>>>> with Snappy. So you need to have it first.
>>>>>>>>>>>> 
>>>>>>>>>>>> Shameless plug:
>>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>> 
>>>>> 
>>>> 
>> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>>>>>>>>>>>> 
>>>>>>>>>>>> This is for 0.96 but should be very similar for 0.98. I will try it
>>>>>>>>>>>> soon and post an update, but keep us posted here so we can support you...
>>>>>>>>>>>> 
>>>>>>>>>>>> JM
>>>>>>>>>>>> 
>>>>>>>>>>>> 
>>>>>>>>>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com>:
>>>>>>>>>>>> 
>>>>>>>>>>>>> Hi,
>>>>>>>>>>>>> 
>>>>>>>>>>>>> I need to install snappy to HBase 0.98.4. (my Hadoop version is 2.4.1)
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Can you please advise what would be wrong? Should my pom.xml be incorrect and missing something?
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Regards
>>>>>>>>>>>>> Arthur
>>>>>>>>>>>>> 
>>>>>>>>>>>>> 
>>>>>>>>>>>>> Below are my commands:
>>>>>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>>>>>>>>>>>>> -Prelease,hadoop-snappy
>>>>>>>>>>>>> 
>>>>>>>>>>>>> log:
>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>> [INFO] Reactor Summary:
>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>> [INFO] HBase ............................................. SUCCESS [3.129s]
>>>>>>>>>>>>> [INFO] HBase - Common .................................... SUCCESS [3.105s]
>>>>>>>>>>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.976s]
>>>>>>>>>>>>> [INFO] HBase - Client .................................... SUCCESS [0.925s]
>>>>>>>>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.183s]
>>>>>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.497s]
>>>>>>>>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.407s]
>>>>>>>>>>>>> [INFO] HBase - Server .................................... FAILURE [0.103s]
>>>>>>>>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>>>>>>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>>>>>>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>>>>>>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>>>>>>>>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>>>>>>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>> [INFO] Total time: 9.939s
>>>>>>>>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>>>>>>>>>>>>> [INFO] Final Memory: 61M/2921M
>>>>>>>>>>>>> [INFO]
>>>>>>>>>>>>> ------------------------------------------------------------------------
>>>>>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve
>>>>>>>>>>>>> dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
>>>>>>>>>>>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>>>>>>>>>>>>> http://maven.oschina.net/content/groups/public/ was cached in the local
>>>>>>>>>>>>> repository, resolution will not be reattempted until the update interval of
>>>>>>>>>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>>>>>>>>>> [ERROR]
>>>>>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>>>>>>>>>>>> -e switch.
>>>>>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>>>>>>>>> [ERROR]
>>>>>>>>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>>>>>>>>>> please read the following articles:
>>>>>>>>>>>>> [ERROR] [Help 1]
>>>>>>>>>>>>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>>>>>>>>>> [ERROR]
>>>>>>>>>>>>> [ERROR] After correcting the problems, you can resume the build with the
>>>>>>>>>>>>> command
>>>>>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>>>> --
>>>>>>> Sean

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Ok.

This is the way the lib path is built:

JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH"
${HBASE_HOME}/build/native/${JAVA_PLATFORM}/lib)

And JAVA_PLATFORM comes from JAVA_PLATFORM=`CLASSPATH=${CLASSPATH} ${JAVA}
org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"`

You can double check it doing:

# Adjust to your JAVA_HOME...
export JAVA_HOME=/usr/local/jdk1.7.0_45/

export CLASSPATH=`bin/hbase classpath`
$JAVA_HOME/bin/java org.apache.hadoop.util.PlatformName | sed -e "s/ /_/g"

Result for me is this: Linux-amd64-64. It might be different for you.
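For reference, append_path just joins path components with a colon. A rough stand-in (a sketch, not the exact upstream script code; the HBASE_HOME value below is an example):

```shell
# Minimal stand-in for the append_path helper used in the HBase scripts.
append_path() {
  if [ -z "$1" ]; then
    echo "$2"
  else
    echo "$1:$2"
  fi
}

JAVA_LIBRARY_PATH=""                  # start empty for the demo
JAVA_PLATFORM=Linux-amd64-64          # the PlatformName result from above
HBASE_HOME=/opt/hbase                 # example path, adjust to yours
JAVA_LIBRARY_PATH=$(append_path "$JAVA_LIBRARY_PATH" "$HBASE_HOME/build/native/$JAVA_PLATFORM/lib")
echo "$JAVA_LIBRARY_PATH"
```

So if your platform string is different (32-bit, for example), the directory HBase searches for native libs is different too, which is why the folder name has to match exactly.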

Then you link the libs the way Alex said before:
cd lib/native/Linux-amd64-64
ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so .
ln -s /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1 .

AND.....

The Hadoop .so too! And I think this is what's missing for you:
ln -s /YOURHADOOPPATH/libhadoop.so .

Your folder should look like this:
jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ tree
.
└── Linux-amd64-64
    ├── libhadoop.so
    ├── libsnappy.so -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so
    └── libsnappy.so.1 -> /home/hbase/snappy-1.0.5/.libs/libsnappy.so.1

I copied libhadoop.so instead of doing a link because it was not available
on this computer.
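Put together, that layout can be recreated like this (a sketch using a scratch directory so it runs anywhere; substitute your real HBASE_HOME, snappy, and Hadoop paths):

```shell
# Recreate the lib/native/<platform> layout HBase expects.
HBASE_HOME=$(mktemp -d)            # stand-in for your real HBase install
PLATFORM=Linux-amd64-64            # the PlatformName result from above
mkdir -p "$HBASE_HOME/lib/native/$PLATFORM"
cd "$HBASE_HOME/lib/native/$PLATFORM"

# snappy libs, assuming they were installed to /usr/local/lib as in the log
ln -s /usr/local/lib/libsnappy.so .
ln -s /usr/local/lib/libsnappy.so.1 .

# libhadoop.so must be there too; link it, or copy it if Hadoop is not
# installed on this machine (example path below)
ln -s /usr/local/hadoop/lib/native/libhadoop.so .

ls -l    # should show the three entries
```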

Then test it:
jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase
org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
2014-08-26 20:06:43,987 INFO  [main] Configuration.deprecation:
hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2014-08-26 20:06:44,831 INFO  [main] util.ChecksumType: Checksum using
org.apache.hadoop.util.PureJavaCrc32
2014-08-26 20:06:44,832 INFO  [main] util.ChecksumType: Checksum can use
org.apache.hadoop.util.PureJavaCrc32C
2014-08-26 20:06:45,125 INFO  [main] compress.CodecPool: Got brand-new
compressor [.snappy]
2014-08-26 20:06:45,131 INFO  [main] compress.CodecPool: Got brand-new
compressor [.snappy]
2014-08-26 20:06:45,254 INFO  [main] compress.CodecPool: Got brand-new
decompressor [.snappy]
SUCCESS


Please let us know if it still doesn't work for you. Without libhadoop.so
it doesn't work for me...
jmspaggi@node8:~/hbase-0.98.5-hadoop2/lib/native$ rm
Linux-amd64-64/libhadoop.so

jmspaggi@node8:~/hbase-0.98.5-hadoop2$ bin/hbase
org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
2014-08-26 20:09:28,945 INFO  [main] Configuration.deprecation:
hadoop.native.lib is deprecated. Instead, use io.native.lib.available
2014-08-26 20:09:29,460 WARN  [main] util.NativeCodeLoader: Unable to load
native-hadoop library for your platform... using builtin-java classes where
applicable
2014-08-26 20:09:29,775 INFO  [main] util.ChecksumType: Checksum using
org.apache.hadoop.util.PureJavaCrc32
2014-08-26 20:09:29,776 INFO  [main] util.ChecksumType: Checksum can use
org.apache.hadoop.util.PureJavaCrc32C
Exception in thread "main" java.lang.UnsatisfiedLinkError:
org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z
    at org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy(Native
Method)
...
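A quick way to check whether a given libhadoop.so was actually built with snappy support (the LIBHADOOP path below is an example, not a path from this thread):

```shell
# A snappy-enabled libhadoop.so is dynamically linked against libsnappy,
# so ldd will list it. Point LIBHADOOP at your real file first.
LIBHADOOP=/usr/local/hadoop/lib/native/libhadoop.so
if [ -f "$LIBHADOOP" ]; then
  ldd "$LIBHADOOP" | grep -i snappy || echo "no snappy linkage in $LIBHADOOP"
else
  echo "adjust LIBHADOOP to point at your libhadoop.so"
fi
# On Hadoop 2.4+, 'hadoop checknative -a' also summarizes native codec support.
```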


I did all of that using a brand new extracted
hbase-0.98.5-hadoop2-bin.tar.gz file.

JM


2014-08-26 19:47 GMT-04:00 Arthur.hk.chan@gmail.com <
arthur.hk.chan@gmail.com>:

> $ uname -m
> x86_64
>
> Arthur
>
> On 27 Aug, 2014, at 7:45 am, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:
>
> > Hi Arthur,
> >
> > What does uname -m give you? You need to check that to create the right
> > folder under the lib directory.
> >
> > JM
> >
> >
> > 2014-08-26 19:43 GMT-04:00 Alex Kamil <al...@gmail.com>:
> >
> >> Something like this worked for me
> >> 1. get hbase binaries
> >> 2. sudo yum install snappy snappy-devel
> >> 3. ln -sf /usr/lib64/libsnappy.so /var/lib/hadoop/lib/native/Linux-amd64-64/.
> >> 4. ln -sf /usr/lib64/libsnappy.so /var/lib/hbase/lib/native/Linux-amd64-64/.
> >> 5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
> >> ref: https://issues.apache.org/jira/browse/PHOENIX-877
> >>
> >>
> >> On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <
> >> arthur.hk.chan@gmail.com> wrote:
> >>
> >>> Hi,
> >>>
> >>> I just tried three more steps but was not able to get thru.
> >>>
> >>>
> >>> 1) copied  snappy files to $HBASE_HOME/lib
> >>> $ cd $HBASE_HOME
> >>> $ ll lib/*sna*
> >>> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54 lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> >>> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43 lib/snappy-java-1.0.4.1.jar
> >>>
> >>> ll lib/native/
> >>> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
> >>>
> >>> ll lib/native/Linux-amd64-64/
> >>> total 18964
> >>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
> >>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
> >>> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> >>> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> >>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
> >>> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> >>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so -> libsnappy.so.1.2.0
> >>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 -> libsnappy.so.1.2.0
> >>> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> >>> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
> >>>
> >>> 2)  $HBASE_HOME/conf/hbase-env.sh, added
> >>>
> >>> ###
> >>> export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> >>> export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> >>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> >>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> >>> ###
> >>>
> >>> 3) restart HBASE and tried again
> >>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>> file:///tmp/snappy-test snappy
> >>> 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation:
> >>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>> SLF4J: Class path contains multiple SLF4J bindings.
> >>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>> explanation.
> >>> 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using
> >>> org.apache.hadoop.util.PureJavaCrc32
> >>> 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can
> use
> >>> org.apache.hadoop.util.PureJavaCrc32C
> >>> Exception in thread "main" java.lang.RuntimeException: native snappy
> >>> library not available: this version of libhadoop was built without snappy support.
> >>>        at
> >>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >>>        at
> >>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >>>        at
> >>>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >>>        at
> >>>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >>>        at
> >>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >>>
> >>>
> >>> Regards
> >>> Arthur
> >>>
> >>>
> >>>
> >>> On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <
> >>> arthur.hk.chan@gmail.com> wrote:
> >>>
> >>>> Hi Sean,
> >>>>
> >>>> Thanks for your reply.
> >>>>
> >>>> I tried the following tests
> >>>>
> >>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>> file:///tmp/snappy-test gz
> >>>> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation:
> >>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>> explanation.
> >>>> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using
> >>> org.apache.hadoop.util.PureJavaCrc32
> >>>> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can
> >> use
> >>> org.apache.hadoop.util.PureJavaCrc32C
> >>>> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully
> >>> loaded & initialized native-zlib library
> >>>> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new
> >>> compressor [.gz]
> >>>> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new
> >>> compressor [.gz]
> >>>> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new
> >>> decompressor [.gz]
> >>>> SUCCESS
> >>>>
> >>>>
> >>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>> file:///tmp/snappy-test snappy
> >>>> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation:
> >>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>> explanation.
> >>>> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using
> >>> org.apache.hadoop.util.PureJavaCrc32
> >>>> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can
> >> use
> >>> org.apache.hadoop.util.PureJavaCrc32C
> >>>> Exception in thread "main" java.lang.RuntimeException: native snappy
> >>>> library not available: this version of libhadoop was built without snappy support.
> >>>>      at
> >>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >>>>      at
> >>>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >>>>      at
> >>>
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >>>>
> >>>>
> >>>> $ hbase shell
> >>>> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation:
> >>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> >>>> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> >>>> Type "exit<RETURN>" to leave the HBase Shell
> >>>> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
> >>>>
> >>>> hbase(main):001:0>
> >>>> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION =>
> >>> 'snappy'}
> >>>> SLF4J: Class path contains multiple SLF4J bindings.
> >>>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>> SLF4J: Found binding in
> >>>
> >>
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> >>> explanation.
> >>>>
> >>>> ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
> >>>>      at
> >>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
> >>>>      at
> >>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> >>>>      at
> >> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
> >>>>      at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> >>>>      at
> >>>
> >>
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> >>>>      at
> >>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> >>>>      at
> >>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> >>>>      at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> >>>>      at
> >>>
> >>
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> >>>>      at
> >>>
> >>
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> >>>>      at java.lang.Thread.run(Thread.java:662)
> >>>>
> >>>>
> >>>>
> >>>>
> >>>> Regards
> >>>> Arthur
> >>>>
> >>>>
> >>>> On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com>
> wrote:
> >>>>
> >>>>> Hi Arthur!
> >>>>>
> >>>>> Our Snappy build instructions are currently out of date and I'm working
> >>>>> on updating them[1]. In short, I don't think there are any special build
> >>>>> steps for using snappy.
> >>>>>
> >>>>> I'm still working out what needs to be included in our instructions for
> >>>>> local and cluster testing.
> >>>>>
> >>>>> If you use the test for compression options, locally things will fail
> >>> because the native hadoop libs won't be present:
> >>>>>
> >>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> >>> file:///tmp/snappy-test snappy
> >>>>> (for comparison, replace "snappy" with "gz" and you will get a warning
> >>>>> about not having native libraries, but the test will succeed.)
> >>>>>
> >>>>> I believe JM's suggestion is for you to copy the Hadoop native
> >>>>> libraries into the local HBase lib/native directory, which would allow the
> >>>>> local test to pass. If you are running in a deployed Hadoop cluster, I
> >>>>> would expect the necessary libraries to already be available to HBase.
> >>>>>
> >>>>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
> >>>>>
> >>>>> -Sean
> >>>>>
> >>>>>
> >>>>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
> >>> arthur.hk.chan@gmail.com> wrote:
> >>>>> Hi JM
> >>>>>
> >>>>> Below are my commands, tried two cases under same source code folder:
> >>>>> a) compile with snappy parameters(failed),
> >>>>> b) compile without snappy parameters (successful).
> >>>>>
> >>>>> Regards
> >>>>> Arthur
> >>>>>
> >>>>> wget
> >>> http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
> >>>>> tar -vxf hbase-0.98.4-src.tar.gz
> >>>>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
> >>>>> cd  hbase-0.98.4-src_snappy
> >>>>> nano dev-support/generate-hadoopX-poms.sh
> >>>>>  (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
> >>>>>
> >>>>>
> >>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >>>>> a) with snappy parameters
> >>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >>> -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [WARNING] The POM for
> >>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no
> >>> dependency information available
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [INFO] Reactor Summary:
> >>>>> [INFO]
> >>>>> [INFO] HBase ............................................. SUCCESS [8.192s]
> >>>>> [INFO] HBase - Common .................................... SUCCESS [5.638s]
> >>>>> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
> >>>>> [INFO] HBase - Client .................................... SUCCESS [1.206s]
> >>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
> >>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
> >>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
> >>>>> [INFO] HBase - Server .................................... FAILURE [0.234s]
> >>>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [INFO] BUILD FAILURE
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [INFO] Total time: 19.474s
> >>>>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
> >>>>> [INFO] Final Memory: 51M/1100M
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
> >>> resolve dependencies for project
> >>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find
> >>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >>> http://maven.oschina.net/content/groups/public/ was cached in the
> local
> >>> repository, resolution will not be reattempted until the update
> interval
> >> of
> >>> nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>> [ERROR]
> >>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
> >>> the -e switch.
> >>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
> logging.
> >>>>> [ERROR]
> >>>>> [ERROR] For more information about the errors and possible solutions,
> >>> please read the following articles:
> >>>>> [ERROR] [Help 1]
> >>>
> >>
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>> [ERROR]
> >>>>> [ERROR] After correcting the problems, you can resume the build with
> >>> the command
> >>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>
> >>>>>
> >>>>>
> >>>>>
> >>>>> b) try again, without snappy parameters
> >>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
> >>>>> [INFO] Building tar:
> >>>
> >>
> /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [INFO] Reactor Summary:
> >>>>> [INFO]
> >>>>> [INFO] HBase ............................................. SUCCESS [3.290s]
> >>>>> [INFO] HBase - Common .................................... SUCCESS [3.119s]
> >>>>> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
> >>>>> [INFO] HBase - Client .................................... SUCCESS [0.920s]
> >>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
> >>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
> >>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
> >>>>> [INFO] HBase - Server .................................... SUCCESS [4.790s]
> >>>>> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
> >>>>> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
> >>>>> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
> >>>>> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
> >>>>> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
> >>>>> [INFO] HBase - Assembly .................................. SUCCESS [13.240s]
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [INFO] BUILD SUCCESS
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>> [INFO] Total time: 31.408s
> >>>>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
> >>>>> [INFO] Final Memory: 57M/1627M
> >>>>> [INFO]
> >>>
> ------------------------------------------------------------------------
> >>>>>
> >>>>>
> >>>>>
> >>>>>
> >>>>>
> >>>>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
> >>> jean-marc@spaggiari.org> wrote:
> >>>>>
> >>>>>> Hi Arthur,
> >>>>>>
> >>>>>> How have you extracted the HBase source and what command do you run to
> >>>>>> build? I will do the same here locally so I can provide you the exact
> >>>>>> steps to complete.
> >>>>>>
> >>>>>> JM
> >>>>>>
> >>>>>>
> >>>>>> 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>> arthur.hk.chan@gmail.com
> >>>>>>> :
> >>>>>>
> >>>>>>> Hi JM
> >>>>>>>
> >>>>>>> Not too sure what you mean, do you mean I should create a new folder
> >>>>>>> in my HBASE_SRC named lib/native/Linux-x86 and copy these files to this
> >>>>>>> folder then try to compile it again?
> >>>>>>>
> >>>>>>> Regards
> >>>>>>> Arthur
> >>>>>>>
> >>>>>>>
> >>>>>>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
> >>> jean-marc@spaggiari.org>
> >>>>>>> wrote:
> >>>>>>>
> >>>>>>>> Hi Arthur,
> >>>>>>>>
> >>>>>>>> Almost done! You now need to copy them into the HBase folder.
> >>>>>>>>
> >>>>>>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar
> >> |
> >>> grep
> >>>>>>> -v
> >>>>>>>> .rb
> >>>>>>>> .
> >>>>>>>> ├── native
> >>>>>>>> │   └── Linux-x86
> >>>>>>>> │       ├── libsnappy.a
> >>>>>>>> │       ├── libsnappy.la
> >>>>>>>> │       ├── libsnappy.so
> >>>>>>>> │       ├── libsnappy.so.1
> >>>>>>>> │       └── libsnappy.so.1.2.0
> >>>>>>>>
> >>>>>>>> I don't have any hadoop-snappy lib in my hbase folder and it works
> >>>>>>>> very well with Snappy for me...
> >>>>>>>>
> >>>>>>>> JM
> >>>>>>>>
> >>>>>>>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>>>>>> arthur.hk.chan@gmail.com
> >>>>>>>>> :
> >>>>>>>>
> >>>>>>>>> Hi JM,
> >>>>>>>>>
> >>>>>>>>> Below are my steps to install snappy lib, do I miss something?
> >>>>>>>>>
> >>>>>>>>> Regards
> >>>>>>>>> Arthur
> >>>>>>>>>
> >>>>>>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> >>>>>>>>> tar -vxf snappy-1.1.1.tar.gz
> >>>>>>>>> cd snappy-1.1.1
> >>>>>>>>> ./configure
> >>>>>>>>> make
> >>>>>>>>> make install
> >>>>>>>>>      make[1]: Entering directory `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>>>>>>      test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> >>>>>>>>>       /bin/sh ./libtool   --mode=install /usr/bin/install -c
> >>>>>>>>> libsnappy.la '/usr/local/lib'
> >>>>>>>>>      libtool: install: /usr/bin/install -c
> >>> .libs/libsnappy.so.1.2.0
> >>>>>>>>> /usr/local/lib/libsnappy.so.1.2.0
> >>>>>>>>>      libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln
> >>> -s
> >>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> >>>>>>>>>      libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>>>>>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
> >>>>>>>>> libsnappy.so.1.2.0 libsnappy.so; }; })
> >>>>>>>>>      libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> >>>>>>>>> /usr/local/lib/libsnappy.la
> >>>>>>>>>      libtool: install: /usr/bin/install -c .libs/libsnappy.a
> >>>>>>>>> /usr/local/lib/libsnappy.a
> >>>>>>>>>      libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> >>>>>>>>>      libtool: install: ranlib /usr/local/lib/libsnappy.a
> >>>>>>>>>      libtool: finish:
> >>>>>>>>>
> >>>>>>>
> >>>
> >>
> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> >>>>>>>>> ldconfig -n /usr/local/lib
> >>>>>>>>>
> >>>>>>>>>
> >>> ----------------------------------------------------------------------
> >>>>>>>>>      Libraries have been installed in:
> >>>>>>>>>      /usr/local/lib
> >>>>>>>>>      If you ever happen to want to link against installed
> >>> libraries
> >>>>>>>>>      in a given directory, LIBDIR, you must either use libtool,
> >>> and
> >>>>>>>>>      specify the full pathname of the library, or use the
> >>> `-LLIBDIR'
> >>>>>>>>>      flag during linking and do at least one of the following:
> >>>>>>>>>      - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> >>>>>>>>>      during execution
> >>>>>>>>>      - add LIBDIR to the `LD_RUN_PATH' environment variable
> >>>>>>>>>      during linking
> >>>>>>>>>      - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> >>>>>>>>>      - have your system administrator add LIBDIR to
> >>> `/etc/ld.so.conf'
> >>>>>>>>>      See any operating system documentation about shared
> >>> libraries for
> >>>>>>>>>      more information, such as the ld(1) and ld.so(8) manual
> >>> pages.
> >>>>>>>>>
> >>>>>>>>>
> >>> ----------------------------------------------------------------------
> >>>>>>>>>      test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> >>>>>>>>> "/usr/local/share/doc/snappy"
> >>>>>>>>>       /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
> >>> README
> >>>>>>>>> format_description.txt framing_format.txt
> >>> '/usr/local/share/doc/snappy'
> >>>>>>>>>      test -z "/usr/local/include" || /bin/mkdir -p
> >>>>>>> "/usr/local/include"
> >>>>>>>>>       /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> >>>>>>>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> >>>>>>>>>      make[1]: Leaving directory
> >>> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>>>>>>
> >>>>>>>>> ll /usr/local/lib
> >>>>>>>>>      -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> >>>>>>>>>      -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> >>>>>>>>>      lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so
> >> ->
> >>>>>>>>> libsnappy.so.1.2.0
> >>>>>>>>>      lrwxrwxrwx. 1 root root       18 Aug 20 00:14
> >> libsnappy.so.1
> >>> ->
> >>>>>>>>> libsnappy.so.1.2.0
> >>>>>>>>>      -rwxr-xr-x. 1 root root   147726 Aug 20 00:14
> >>> libsnappy.so.1.2.0
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
> >>>>>>> jean-marc@spaggiari.org>
> >>>>>>>>> wrote:
> >>>>>>>>>
> >>>>>>>>>> Hi Arthur,
> >>>>>>>>>>
> >>>>>>>>>> Do you have snappy libs installed and configured? HBase doesn't
> >>> come
> >>>>>>> with
> >>>>>>>>>> Snappy. So you need to have it first.
> >>>>>>>>>>
> >>>>>>>>>> Shameless plug:
> >>>>>>>>>>
> >>>>>>>>>
> >>>>>>>
> >>>
> >>
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >>>>>>>>>>
> >>>>>>>>>> This is for 0.96 but should be very similar for 0.98. I will try
> >>> it
> >>>>>>> soon
> >>>>>>>>>> and post an update, but keep us posted here so we can support
> >>> you...
> >>>>>>>>>>
> >>>>>>>>>> JM
> >>>>>>>>>>
> >>>>>>>>>>
> >>>>>>>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>>>>>>>> arthur.hk.chan@gmail.com
> >>>>>>>>>>> :
> >>>>>>>>>>
> >>>>>>>>>>> Hi,
> >>>>>>>>>>>
> >>>>>>>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version
> >> is
> >>>>>>> 2.4.1)
> >>>>>>>>>>>
> >>>>>>>>>>> Can you please advise what would be wrong?  Should my pom.xml
> >> be
> >>>>>>>>> incorrect
> >>>>>>>>>>> and missing something?
> >>>>>>>>>>>
> >>>>>>>>>>> Regards
> >>>>>>>>>>> Arthur
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>>>>>>> Below are my commands:
> >>>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4
> >>> 0.98.4-hadoop2
> >>>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >>>>>>>>>>> -Prelease,hadoop-snappy
> >>>>>>>>>>>
> >>>>>>>>>>> log:
> >>>>>>>>>>> [INFO]
> >>>>>>>>>>>
> >>>>>>>
> >>>
> ------------------------------------------------------------------------
> >>>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>>>>>>>> [INFO]
> >>>>>>>>>>>
> >>>>>>>
> >>>
> ------------------------------------------------------------------------
> >>>>>>>>>>> [WARNING] The POM for
> >>>>>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> >>>>>>>>>>> is missing, no dependency information available
> >>>>>>>>>>> [INFO]
> >>>>>>>>>>>
> >>>>>>>
> >>>
> ------------------------------------------------------------------------
> >>>>>>>>>>> [INFO] Reactor Summary:
> >>>>>>>>>>> [INFO]
> >>>>>>>>>>> [INFO] HBase .............................................
> >>> SUCCESS
> >>>>>>>>> [3.129s]
> >>>>>>>>>>> [INFO] HBase - Common ....................................
> >>> SUCCESS
> >>>>>>>>> [3.105s]
> >>>>>>>>>>> [INFO] HBase - Protocol ..................................
> >>> SUCCESS
> >>>>>>>>> [0.976s]
> >>>>>>>>>>> [INFO] HBase - Client ....................................
> >>> SUCCESS
> >>>>>>>>> [0.925s]
> >>>>>>>>>>> [INFO] HBase - Hadoop Compatibility ......................
> >>> SUCCESS
> >>>>>>>>> [0.183s]
> >>>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility ..................
> >>> SUCCESS
> >>>>>>>>> [0.497s]
> >>>>>>>>>>> [INFO] HBase - Prefix Tree ...............................
> >>> SUCCESS
> >>>>>>>>> [0.407s]
> >>>>>>>>>>> [INFO] HBase - Server ....................................
> >>> FAILURE
> >>>>>>>>> [0.103s]
> >>>>>>>>>>> [INFO] HBase - Testing Util ..............................
> >>> SKIPPED
> >>>>>>>>>>> [INFO] HBase - Thrift ....................................
> >>> SKIPPED
> >>>>>>>>>>> [INFO] HBase - Shell .....................................
> >>> SKIPPED
> >>>>>>>>>>> [INFO] HBase - Integration Tests .........................
> >>> SKIPPED
> >>>>>>>>>>> [INFO] HBase - Examples ..................................
> >>> SKIPPED
> >>>>>>>>>>> [INFO] HBase - Assembly ..................................
> >>> SKIPPED
> >>>>>>>>>>> [INFO]
> >>>>>>>>>>>
> >>>>>>>
> >>>
> ------------------------------------------------------------------------
> >>>>>>>>>>> [INFO] BUILD FAILURE
> >>>>>>>>>>> [INFO]
> >>>>>>>>>>>
> >>>>>>>
> >>>
> ------------------------------------------------------------------------
> >>>>>>>>>>> [INFO] Total time: 9.939s
> >>>>>>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >>>>>>>>>>> [INFO] Final Memory: 61M/2921M
> >>>>>>>>>>> [INFO]
> >>>>>>>>>>>
> >>>>>>>
> >>>
> ------------------------------------------------------------------------
> >>>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could
> >> not
> >>>>>>>>> resolve
> >>>>>>>>>>> dependencies for project
> >>>>>>>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> >>>>>>>>>>> Failure to find
> >>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >>>>>>>>>>> http://maven.oschina.net/content/groups/public/ was cached in
> >>> the
> >>>>>>> local
> >>>>>>>>>>> repository, resolution will not be reattempted until the update
> >>>>>>>>> interval of
> >>>>>>>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>>>>>>>> [ERROR]
> >>>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven
> >>> with
> >>>>>>> the
> >>>>>>>>>>> -e switch.
> >>>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
> >>> logging.
> >>>>>>>>>>> [ERROR]
> >>>>>>>>>>> [ERROR] For more information about the errors and possible
> >>> solutions,
> >>>>>>>>>>> please read the following articles:
> >>>>>>>>>>> [ERROR] [Help 1]
> >>>>>>>>>>>
> >>>>>>>>>
> >>>>>>>
> >>>
> >>
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>>>>>>>> [ERROR]
> >>>>>>>>>>> [ERROR] After correcting the problems, you can resume the build
> >>> with
> >>>>>>> the
> >>>>>>>>>>> command
> >>>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>>>>>>>
> >>>>>>>>>>>
> >>>>>>>>>
> >>>>>>>>>
> >>>>>>>
> >>>>>>>
> >>>>>
> >>>>>
> >>>>>
> >>>>>
> >>>>> --
> >>>>> Sean
> >>>>
> >>>
> >>>
> >>
>
>
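[Editor's note] The dependency-resolution failure above means no configured repository serves org.apache.hadoop:hadoop-snappy:0.0.1-SNAPSHOT; the hadoop-snappy profile expects that artifact to already be in the local Maven repository. A hedged sketch of where Maven looks, plus the manual install command (this assumes you have built the hadoop-snappy jar from its own source tree; the file path is an assumption):

```shell
#!/bin/sh
# Where Maven resolves the artifact locally (standard ~/.m2 layout).
gid=org/apache/hadoop; aid=hadoop-snappy; ver=0.0.1-SNAPSHOT
repo="$HOME/.m2/repository/$gid/$aid/$ver"
echo "$repo"
# If nothing is there, install the jar by hand (the -Dfile path is an
# assumption -- point it at wherever your hadoop-snappy build put the jar):
#   mvn install:install-file -DgroupId=org.apache.hadoop -DartifactId=hadoop-snappy \
#       -Dversion=0.0.1-SNAPSHOT -Dpackaging=jar \
#       -Dfile=target/hadoop-snappy-0.0.1-SNAPSHOT.jar
```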

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
$ uname -m
x86_64

Arthur
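[Editor's note] The x86_64 answer above maps onto the directory name expected under lib/native. A small sketch of the mapping (the Linux-amd64-64 / Linux-i386-32 names follow the hadoop-snappy build convention; treat anything beyond those two as a best-effort guess):

```shell
#!/bin/sh
# Map the machine type printed by `uname -m` to the native-library folder
# name used by the hadoop-snappy layout (only the first two cases are known).
native_dir() {
  case "$1" in
    x86_64) echo "Linux-amd64-64" ;;
    i?86)   echo "Linux-i386-32"  ;;
    *)      echo "Linux-$1"       ;;  # unknown platform: best-effort name
  esac
}
native_dir "$(uname -m)"
```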

On 27 Aug, 2014, at 7:45 am, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> Hi Arthur,
> 
> What uname -m gives you? you need to check that to create the right folder
> under the lib directory.
> 
> JM
> 
> 
> 2014-08-26 19:43 GMT-04:00 Alex Kamil <al...@gmail.com>:
> 
>> Something like this worked for me
>> 1. get hbase binaries
>> 2. sudo yum install snappy snappy-devel
>> 3. ln -sf /usr/lib64/libsnappy.so
>> /var/lib/hadoop/lib/native/Linux-amd64-64/.
>> 4. ln -sf /usr/lib64/libsnappy.so
>> /var/lib/hbase/lib/native/Linux-amd64-64/.
>> 5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
>> ref: https://issues.apache.org/jira/browse/PHOENIX-877
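[Editor's note] Steps 3-4 above can be rehearsed safely: the block below performs the same mkdir/ln dance in a throwaway directory, with the real /usr/lib64 and /var/lib paths (assumptions about a typical RPM layout) left as comments.

```shell
#!/bin/sh
# Rehearse the symlink layout in a temp dir; on a real node replace $ROOT with /
# and the stand-in .so with /usr/lib64/libsnappy.so (path assumed from yum).
ROOT=$(mktemp -d)
touch "$ROOT/libsnappy.so"                       # stand-in for the system library
for d in "$ROOT/hadoop/lib/native/Linux-amd64-64" \
         "$ROOT/hbase/lib/native/Linux-amd64-64"; do
  mkdir -p "$d"
  ln -sf "$ROOT/libsnappy.so" "$d/libsnappy.so"  # steps 3/4 from the list above
done
ls "$ROOT/hbase/lib/native/Linux-amd64-64"       # prints: libsnappy.so
```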
>> 
>> 
>> On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <
>> arthur.hk.chan@gmail.com> wrote:
>> 
>>> Hi,
>>> 
>>> I just tried three more steps but was not able to get through.
>>> 
>>> 
>>> 1) copied  snappy files to $HBASE_HOME/lib
>>> $ cd $HBASE_HOME
>>> $ ll lib/*sna*
>>> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54
>>> lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>>> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43
>> lib/snappy-java-1.0.4.1.jar
>>> 
>>> ll lib/native/
>>> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
>>> 
>>> ll lib/native/Linux-amd64-64/
>>> total 18964
>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so ->
>>> libhadoopsnappy.so.0.0.1
>>> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 ->
>>> libhadoopsnappy.so.0.0.1
>>> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
>>> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so ->
>>> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so ->
>>> libprotobuf-lite.so.8.0.0
>>> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 ->
>>> libprotobuf-lite.so.8.0.0
>>> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08
>> libprotobuf-lite.so.8.0.0
>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so ->
>>> libprotobuf.so.8.0.0
>>> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
>>> libprotobuf.so.8.0.0
>>> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so ->
>>> libprotoc.so.8.0.0
>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
>>> libprotoc.so.8.0.0
>>> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so ->
>>> libsnappy.so.1.2.0
>>> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 ->
>>> libsnappy.so.1.2.0
>>> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
>>> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
>>> 
>>> 2)  $HBASE_HOME/conf/hbase-env.sh, added
>>> 
>>> ###
>>> export
>>> 
>> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
>>> export
>>> 
>> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
>>> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
>>> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
>>> ###
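[Editor's note] A condensed, runnable form of the hbase-env.sh additions above; the install prefixes are placeholders taken from this thread (assumptions), not canonical paths.

```shell
#!/bin/sh
# Sketch of the environment setup from hbase-env.sh above; adjust the two
# *_HOME defaults (assumed paths) to your actual installs.
HADOOP_HOME=${HADOOP_HOME:-/edh/hadoop/hadoop-2.4.1}
HBASE_HOME=${HBASE_HOME:-/edh/hadoop/hbase-0.98.4-hadoop2}
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}$HADOOP_HOME/lib/native/Linux-amd64-64:/usr/local/lib"
export HBASE_LIBRARY_PATH="$HBASE_HOME/lib/native/Linux-amd64-64:/usr/local/lib"
echo "$LD_LIBRARY_PATH"
```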
>>> 
>>> 3) restart HBASE and tried again
>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>> file:///tmp/snappy-test snappy
>>> 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation:
>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>> SLF4J: Class path contains multiple SLF4J bindings.
>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>> 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using
>>> org.apache.hadoop.util.PureJavaCrc32
>>> 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can use
>>> org.apache.hadoop.util.PureJavaCrc32C
>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>>> library not available: this version of libhadoop was built without snappy
>>> support.
>>>        at
>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>        at
>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>        at
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>        at
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>        at
>>> 
>> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>        at
>>> 
>> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>        at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>        at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>        at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>        at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>        at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>        at
>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>        at
>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>> 
>>> 
>>> Regards
>>> Arthur
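[Editor's note] The "libhadoop was built without snappy support" message above is thrown by Hadoop's native library, not by HBase itself. On Hadoop 2.4 you can check what libhadoop.so was compiled with via `hadoop checknative -a`; the sketch below parses a sample of that output (the sample lines are illustrative, not captured from this cluster).

```shell
#!/bin/sh
# Sample `hadoop checknative -a` output (illustrative); on a real node pipe
# the command itself instead of this here-string.
sample='hadoop:  true /edh/hadoop/hadoop-2.4.1/lib/native/libhadoop.so
zlib:    true /lib64/libz.so.1
snappy:  false'
snappy_line=$(printf '%s\n' "$sample" | grep '^snappy:')
case "$snappy_line" in
  *true*) echo "libhadoop has snappy support" ;;
  *)      echo "libhadoop built without snappy: rebuild Hadoop native libs with -Drequire.snappy" ;;
esac
```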
>>> 
>>> 
>>> 
>>> On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <
>>> arthur.hk.chan@gmail.com> wrote:
>>> 
>>>> Hi Sean,
>>>> 
>>>> Thanks for your reply.
>>>> 
>>>> I tried the following tests
>>>> 
>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>> file:///tmp/snappy-test gz
>>>> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation:
>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>>> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using
>>> org.apache.hadoop.util.PureJavaCrc32
>>>> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can
>> use
>>> org.apache.hadoop.util.PureJavaCrc32C
>>>> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully
>>> loaded & initialized native-zlib library
>>>> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new
>>> compressor [.gz]
>>>> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new
>>> compressor [.gz]
>>>> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new
>>> decompressor [.gz]
>>>> SUCCESS
>>>> 
>>>> 
>>>> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>> file:///tmp/snappy-test snappy
>>>> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation:
>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>>> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using
>>> org.apache.hadoop.util.PureJavaCrc32
>>>> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can
>> use
>>> org.apache.hadoop.util.PureJavaCrc32C
>>>> Exception in thread "main" java.lang.RuntimeException: native snappy
>>> library not available: this version of libhadoop was built without snappy
>>> support.
>>>>      at
>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>>>>      at
>>> 
>> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>>>>      at
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>>>>      at
>>> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>>>> 
>>>> 
>>>> $ hbase shell
>>>> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation:
>>> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
>>>> HBase Shell; enter 'help<RETURN>' for list of supported commands.
>>>> Type "exit<RETURN>" to leave the HBase Shell
>>>> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
>>>> 
>>>> hbase(main):001:0>
>>>> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION =>
>>> 'snappy'}
>>>> SLF4J: Class path contains multiple SLF4J bindings.
>>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: Found binding in
>>> 
>> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
>>>> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
>>> explanation.
>>>> 
>>>> ERROR: java.io.IOException: Compression algorithm 'snappy' previously
>>> failed test.
>>>>      at
>>> 
>> org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
>>>>      at
>>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
>>>>      at
>>> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
>>>>      at
>> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
>>>>      at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
>>>>      at
>>> 
>> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
>>>>      at
>>> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
>>>>      at
>>> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
>>>>      at java.util.concurrent.FutureTask.run(FutureTask.java:138)
>>>>      at
>>> 
>> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
>>>>      at
>>> 
>> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
>>>>      at java.lang.Thread.run(Thread.java:662)
>>>> 
>>>> 
>>>> 
>>>> 
>>>> Regards
>>>> Arthur
>>>> 
>>>> 
>>>> On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com> wrote:
>>>> 
>>>>> Hi Arthur!
>>>>> 
>>>>> Our Snappy build instructions are currently out of date and I'm
>> working
>>> on updating them[1]. In short, I don't think there are any special build
>>> steps for using snappy.
>>>>> 
>>>>> I'm still working out what needs to be included in our instructions
>> for
>>> local and cluster testing.
>>>>> 
>>>>> If you use the test for compression options, locally things will fail
>>> because the native hadoop libs won't be present:
>>>>> 
>>>>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
>>> file:///tmp/snappy-test snappy
>>>>> (for comparison, replace "snappy" with "gz" and you will get a warning
>>> about not having native libraries, but the test will succeed.)
>>>>> 
>>>>> I believe JM's suggestion is for you to copy the Hadoop native
>>> libraries into the local HBase lib/native directory, which would allow
>> the
>>> local test to pass. If you are running in a deployed Hadoop cluster, I
>>> would expect the necessary libraries to already be available to HBase.
>>>>> 
>>>>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
>>>>> 
>>>>> -Sean
>>>>> 
>>>>> 
>>>>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
>>> arthur.hk.chan@gmail.com> wrote:
>>>>> Hi JM
>>>>> 
>>>>> Below are my commands, tried two cases under same source code folder:
>>>>> a) compile with snappy parameters(failed),
>>>>> b) compile without snappy parameters (successful).
>>>>> 
>>>>> Regards
>>>>> Arthur
>>>>> 
>>>>> wget
>>> http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
>>>>> tar -vxf hbase-0.98.4-src.tar.gz
>>>>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
>>>>> cd  hbase-0.98.4-src_snappy
>>>>> nano dev-support/generate-hadoopX-poms.sh
>>>>>  (change  hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
>>>>> 
>>>>> 
>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>> a) with snappy parameters
>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>>> -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [WARNING] The POM for
>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no
>>> dependency information available
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [INFO] Reactor Summary:
>>>>> [INFO]
>>>>> [INFO] HBase ............................................. SUCCESS
>>> [8.192s]
>>>>> [INFO] HBase - Common .................................... SUCCESS
>>> [5.638s]
>>>>> [INFO] HBase - Protocol .................................. SUCCESS
>>> [1.535s]
>>>>> [INFO] HBase - Client .................................... SUCCESS
>>> [1.206s]
>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
>>> [0.193s]
>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
>>> [0.798s]
>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
>>> [0.438s]
>>>>> [INFO] HBase - Server .................................... FAILURE
>>> [0.234s]
>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [INFO] BUILD FAILURE
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [INFO] Total time: 19.474s
>>>>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
>>>>> [INFO] Final Memory: 51M/1100M
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
>>> resolve dependencies for project
>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find
>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>>> http://maven.oschina.net/content/groups/public/ was cached in the local
>>> repository, resolution will not be reattempted until the update interval
>> of
>>> nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>> [ERROR]
>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>>> the -e switch.
>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>> [ERROR]
>>>>> [ERROR] For more information about the errors and possible solutions,
>>> please read the following articles:
>>>>> [ERROR] [Help 1]
>>> 
>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>> [ERROR]
>>>>> [ERROR] After correcting the problems, you can resume the build with
>>> the command
>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>>> b) try again, without snappy parameters
>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
>>>>> [INFO] Building tar:
>>> 
>> /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [INFO] Reactor Summary:
>>>>> [INFO]
>>>>> [INFO] HBase ............................................. SUCCESS
>>> [3.290s]
>>>>> [INFO] HBase - Common .................................... SUCCESS
>>> [3.119s]
>>>>> [INFO] HBase - Protocol .................................. SUCCESS
>>> [0.972s]
>>>>> [INFO] HBase - Client .................................... SUCCESS
>>> [0.920s]
>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
>>> [0.167s]
>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
>>> [0.504s]
>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
>>> [0.382s]
>>>>> [INFO] HBase - Server .................................... SUCCESS
>>> [4.790s]
>>>>> [INFO] HBase - Testing Util .............................. SUCCESS
>>> [0.598s]
>>>>> [INFO] HBase - Thrift .................................... SUCCESS
>>> [1.536s]
>>>>> [INFO] HBase - Shell ..................................... SUCCESS
>>> [0.369s]
>>>>> [INFO] HBase - Integration Tests ......................... SUCCESS
>>> [0.443s]
>>>>> [INFO] HBase - Examples .................................. SUCCESS
>>> [0.459s]
>>>>> [INFO] HBase - Assembly .................................. SUCCESS
>>> [13.240s]
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [INFO] BUILD SUCCESS
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> [INFO] Total time: 31.408s
>>>>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
>>>>> [INFO] Final Memory: 57M/1627M
>>>>> [INFO]
>>> ------------------------------------------------------------------------
>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
>>> jean-marc@spaggiari.org> wrote:
>>>>> 
>>>>>> Hi Arthur,
>>>>>> 
>>>>>> How have you extracted HBase source and what command do you run to
>>> build? I
>>>>>> will do the same here locally so I can provide you the exact step to
>>>>>> complete.
>>>>>> 
>>>>>> JM
>>>>>> 
>>>>>> 
>>>>>> 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
>>> arthur.hk.chan@gmail.com
>>>>>>> :
>>>>>> 
>>>>>>> Hi JM
>>>>>>> 
>>>>>>> Not too sure what you mean, do you mean I should create a new
>> folder
>>> in my
>>>>>>> HBASE_SRC named lib/native/Linux-x86 and copy these files to this
>>> folder
>>>>>>> then try to compile it again?
>>>>>>> 
>>>>>>> Regards
>>>>>>> Arthur
>>>>>>> 
>>>>>>> 
>>>>>>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
>>> jean-marc@spaggiari.org>
>>>>>>> wrote:
>>>>>>> 
>>>>>>>> Hi Arthur,
>>>>>>>> 
>>>>>>>> Almost done! You now need to copy them into the HBase folder.
>>>>>>>> 
>>>>>>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar
>> |
>>> grep
>>>>>>> -v
>>>>>>>> .rb
>>>>>>>> .
>>>>>>>> ├── native
>>>>>>>> │   └── Linux-x86
>>>>>>>> │       ├── libsnappy.a
>>>>>>>> │       ├── libsnappy.la
>>>>>>>> │       ├── libsnappy.so
>>>>>>>> │       ├── libsnappy.so.1
>>>>>>>> │       └── libsnappy.so.1.2.0
>>>>>>>> 
>>>>>>>> I don't have any hadoop-snappy lib in my hbase folder and it works
>>> very
>>>>>>>> well with Snappy for me...
>>>>>>>> 
>>>>>>>> JM
>>>>>>>> 
>>>>>>>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>>>>> arthur.hk.chan@gmail.com
>>>>>>>>> :
>>>>>>>> 
>>>>>>>>> Hi JM,
>>>>>>>>> 
>>>>>>>>> Below are my steps to install snappy lib, do I miss something?
>>>>>>>>> 
>>>>>>>>> Regards
>>>>>>>>> Arthur
>>>>>>>>> 
>>>>>>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>>>>>>>>> tar -vxf snappy-1.1.1.tar.gz
>>>>>>>>> cd snappy-1.1.1
>>>>>>>>> ./configure
>>>>>>>>> make
>>>>>>>>> make install
>>>>>>>>>      make[1]: Entering directory
>>>>>>> `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>>>      test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>>>>>>>>>       /bin/sh ./libtool   --mode=install /usr/bin/install -c
>>>>>>>>> libsnappy.la '/usr/local/lib'
>>>>>>>>>      libtool: install: /usr/bin/install -c
>>> .libs/libsnappy.so.1.2.0
>>>>>>>>> /usr/local/lib/libsnappy.so.1.2.0
>>>>>>>>>      libtool: install: (cd /usr/local/lib && { ln -s -f
>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln
>>> -s
>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
>>>>>>>>>      libtool: install: (cd /usr/local/lib && { ln -s -f
>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
>>>>>>>>> libsnappy.so.1.2.0 libsnappy.so; }; })
>>>>>>>>>      libtool: install: /usr/bin/install -c .libs/libsnappy.lai
>>>>>>>>> /usr/local/lib/libsnappy.la
>>>>>>>>>      libtool: install: /usr/bin/install -c .libs/libsnappy.a
>>>>>>>>> /usr/local/lib/libsnappy.a
>>>>>>>>>      libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>>>>>>>>>      libtool: install: ranlib /usr/local/lib/libsnappy.a
>>>>>>>>>      libtool: finish:
>>>>>>>>> 
>>>>>>> 
>>> 
>> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
>>>>>>>>> ldconfig -n /usr/local/lib
>>>>>>>>> 
>>>>>>>>> 
>>> ----------------------------------------------------------------------
>>>>>>>>>      Libraries have been installed in:
>>>>>>>>>      /usr/local/lib
>>>>>>>>>      If you ever happen to want to link against installed
>>> libraries
>>>>>>>>>      in a given directory, LIBDIR, you must either use libtool,
>>> and
>>>>>>>>>      specify the full pathname of the library, or use the
>>> `-LLIBDIR'
>>>>>>>>>      flag during linking and do at least one of the following:
>>>>>>>>>      - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>>>>>>>>>      during execution
>>>>>>>>>      - add LIBDIR to the `LD_RUN_PATH' environment variable
>>>>>>>>>      during linking
>>>>>>>>>      - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>>>>>>>>>      - have your system administrator add LIBDIR to
>>> `/etc/ld.so.conf'
>>>>>>>>>      See any operating system documentation about shared
>>> libraries for
>>>>>>>>>      more information, such as the ld(1) and ld.so(8) manual
>>> pages.
>>>>>>>>> 
>>>>>>>>> 
>>> ----------------------------------------------------------------------
>>>>>>>>>      test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
>>>>>>>>> "/usr/local/share/doc/snappy"
>>>>>>>>>       /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
>>> README
>>>>>>>>> format_description.txt framing_format.txt
>>> '/usr/local/share/doc/snappy'
>>>>>>>>>      test -z "/usr/local/include" || /bin/mkdir -p
>>>>>>> "/usr/local/include"
>>>>>>>>>       /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
>>>>>>>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
>>>>>>>>>      make[1]: Leaving directory
>>> `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>>>>>> 
>>>>>>>>> ll /usr/local/lib
>>>>>>>>>      -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
>>>>>>>>>      -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
>>>>>>>>>      lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so
>> ->
>>>>>>>>> libsnappy.so.1.2.0
>>>>>>>>>      lrwxrwxrwx. 1 root root       18 Aug 20 00:14
>> libsnappy.so.1
>>> ->
>>>>>>>>> libsnappy.so.1.2.0
>>>>>>>>>      -rwxr-xr-x. 1 root root   147726 Aug 20 00:14
>>> libsnappy.so.1.2.0
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
>>>>>>> jean-marc@spaggiari.org>
>>>>>>>>> wrote:
>>>>>>>>> 
>>>>>>>>>> Hi Arthur,
>>>>>>>>>> 
>>>>>>>>>> Do you have snappy libs installed and configured? HBase doesn't
>>> come
>>>>>>> with
>>>>>>>>>> Snappy. So you need to have it first.
>>>>>>>>>> 
>>>>>>>>>> Shameless plug:
>>>>>>>>>> 
>>>>>>>>> 
>>>>>>> 
>>> 
>> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>>>>>>>>>> 
>>>>>>>>>> This is for 0.96 but should be very similar for 0.98. I will try
>>> it
>>>>>>> soon
>>>>>>>>>> and post an update, but keep us posted here so we can support
>>> you...
>>>>>>>>>> 
>>>>>>>>>> JM
>>>>>>>>>> 
>>>>>>>>>> 
>>>>>>>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>>>>>>> arthur.hk.chan@gmail.com
>>>>>>>>>>> :
>>>>>>>>>> 
>>>>>>>>>>> Hi,
>>>>>>>>>>> 
>>>>>>>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version
>> is
>>>>>>> 2.4.1)
>>>>>>>>>>> 
>>>>>>>>>>> Can you please advise what might be wrong? Could my pom.xml
>> be
>>>>>>>>> incorrect
>>>>>>>>>>> or missing something?
>>>>>>>>>>> 
>>>>>>>>>>> Regards
>>>>>>>>>>> Arthur
>>>>>>>>>>> 
>>>>>>>>>>> 
>>>>>>>>>>> Below are my commands:
>>>>>>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4
>>> 0.98.4-hadoop2
>>>>>>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>>>>>>>>>>> -Prelease,hadoop-snappy
>>>>>>>>>>> 
>>>>>>>>>>> Log:
>>>>>>>>>>> [INFO]
>>>>>>>>>>> 
>>>>>>> 
>>> ------------------------------------------------------------------------
>>>>>>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>>>>>>>> [INFO]
>>>>>>>>>>> 
>>>>>>> 
>>> ------------------------------------------------------------------------
>>>>>>>>>>> [WARNING] The POM for
>>>>>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
>>>>>>>>>>> is missing, no dependency information available
>>>>>>>>>>> [INFO]
>>>>>>>>>>> 
>>>>>>> 
>>> ------------------------------------------------------------------------
>>>>>>>>>>> [INFO] Reactor Summary:
>>>>>>>>>>> [INFO]
>>>>>>>>>>> [INFO] HBase .............................................
>>> SUCCESS
>>>>>>>>> [3.129s]
>>>>>>>>>>> [INFO] HBase - Common ....................................
>>> SUCCESS
>>>>>>>>> [3.105s]
>>>>>>>>>>> [INFO] HBase - Protocol ..................................
>>> SUCCESS
>>>>>>>>> [0.976s]
>>>>>>>>>>> [INFO] HBase - Client ....................................
>>> SUCCESS
>>>>>>>>> [0.925s]
>>>>>>>>>>> [INFO] HBase - Hadoop Compatibility ......................
>>> SUCCESS
>>>>>>>>> [0.183s]
>>>>>>>>>>> [INFO] HBase - Hadoop Two Compatibility ..................
>>> SUCCESS
>>>>>>>>> [0.497s]
>>>>>>>>>>> [INFO] HBase - Prefix Tree ...............................
>>> SUCCESS
>>>>>>>>> [0.407s]
>>>>>>>>>>> [INFO] HBase - Server ....................................
>>> FAILURE
>>>>>>>>> [0.103s]
>>>>>>>>>>> [INFO] HBase - Testing Util ..............................
>>> SKIPPED
>>>>>>>>>>> [INFO] HBase - Thrift ....................................
>>> SKIPPED
>>>>>>>>>>> [INFO] HBase - Shell .....................................
>>> SKIPPED
>>>>>>>>>>> [INFO] HBase - Integration Tests .........................
>>> SKIPPED
>>>>>>>>>>> [INFO] HBase - Examples ..................................
>>> SKIPPED
>>>>>>>>>>> [INFO] HBase - Assembly ..................................
>>> SKIPPED
>>>>>>>>>>> [INFO]
>>>>>>>>>>> 
>>>>>>> 
>>> ------------------------------------------------------------------------
>>>>>>>>>>> [INFO] BUILD FAILURE
>>>>>>>>>>> [INFO]
>>>>>>>>>>> 
>>>>>>> 
>>> ------------------------------------------------------------------------
>>>>>>>>>>> [INFO] Total time: 9.939s
>>>>>>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>>>>>>>>>>> [INFO] Final Memory: 61M/2921M
>>>>>>>>>>> [INFO]
>>>>>>>>>>> 
>>>>>>> 
>>> ------------------------------------------------------------------------
>>>>>>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could
>> not
>>>>>>>>> resolve
>>>>>>>>>>> dependencies for project
>>>>>>>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
>>>>>>>>>>> Failure to find
>>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>>>>>>>>>>> http://maven.oschina.net/content/groups/public/ was cached in
>>> the
>>>>>>> local
>>>>>>>>>>> repository, resolution will not be reattempted until the update
>>>>>>>>> interval of
>>>>>>>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>>>>>>>> [ERROR]
>>>>>>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven
>>> with
>>>>>>> the
>>>>>>>>>>> -e switch.
>>>>>>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
>>> logging.
>>>>>>>>>>> [ERROR]
>>>>>>>>>>> [ERROR] For more information about the errors and possible
>>> solutions,
>>>>>>>>>>> please read the following articles:
>>>>>>>>>>> [ERROR] [Help 1]
>>>>>>>>>>> 
>>>>>>>>> 
>>>>>>> 
>>> 
>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>>>>>>>> [ERROR]
>>>>>>>>>>> [ERROR] After correcting the problems, you can resume the build
>>> with
>>>>>>> the
>>>>>>>>>>> command
>>>>>>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>>>>>>>> 
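>>>>>>>>>>>
[Editor's note: the root cause of the failure above is that
org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT was never published to a
public Maven repository, so the hadoop-snappy profile can only resolve it from
the local repository. If you have built the jar yourself, one way to install
it locally is sketched below; the helper only prints the mvn command so it can
be reviewed before running, and the jar filename is an assumption.]

```shell
# Compose the `mvn install:install-file` invocation that puts a locally
# built hadoop-snappy jar into ~/.m2 under the coordinates the HBase
# build expects. The command is printed, not executed; run the printed
# line (or pipe it through `sh`) to perform the install.
install_cmd() {
  echo "mvn install:install-file" \
       "-DgroupId=org.apache.hadoop -DartifactId=hadoop-snappy" \
       "-Dversion=0.0.1-SNAPSHOT -Dpackaging=jar -Dfile=$1"
}

install_cmd hadoop-snappy-0.0.1-SNAPSHOT.jar
```

After installing the artifact, re-running the build with -Prelease,hadoop-snappy
should get past the dependency-resolution error.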
>>>>>>>>>>> 
>>>>>>>>> 
>>>>>>>>> 
>>>>>>> 
>>>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>>> 
>>>>> --
>>>>> Sean
>>>> 
>>> 
>>> 
>> 


Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Hi Arthur,

What does uname -m give you? You need to check that to create the right folder
under the lib directory.

JM
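[Editor's note: JM's point is that the folder name under lib/native/ must match
the machine architecture. A sketch of the mapping; the Linux-amd64-64 and
Linux-i386-32 names follow the convention used elsewhere in this thread, and
the fallback branch for other platforms is a best-effort assumption.]

```shell
# Map the architecture reported by `uname -m` to the platform folder
# name expected under $HBASE_HOME/lib/native/. Only the two common
# Linux cases are known-good here; the fallback is a guess.
native_dir_for() {
  case "$1" in
    x86_64)    echo "Linux-amd64-64" ;;
    i386|i686) echo "Linux-i386-32" ;;
    *)         echo "Linux-$1" ;;
  esac
}

native_dir_for "$(uname -m)"
```

On a 64-bit Linux host this yields Linux-amd64-64, matching the directory
Arthur's listings show.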


2014-08-26 19:43 GMT-04:00 Alex Kamil <al...@gmail.com>:

> Something like this worked for me:
> 1. get hbase binaries
> 2. sudo yum install snappy snappy-devel
> 3. ln -sf /usr/lib64/libsnappy.so
> /var/lib/hadoop/lib/native/Linux-amd64-64/.
> 4. ln -sf /usr/lib64/libsnappy.so
> /var/lib/hbase/lib/native/Linux-amd64-64/.
> 5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
> ref: https://issues.apache.org/jira/browse/PHOENIX-877
>
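>
[Editor's note: steps 2-4 above can be collected into a small helper. The
/var/lib paths and the Linux-amd64-64 folder name are from Alex's environment,
and installing snappy/snappy-devel via yum is assumed to have been done first.]

```shell
# Link an installed libsnappy.so into each native-library directory that
# Hadoop and HBase search. Creates the directories if missing; existing
# links are replaced (-f). Run as a user with write access to the dirs.
link_snappy() {
  src="$1"; shift
  for dir in "$@"; do
    mkdir -p "$dir"
    ln -sf "$src" "$dir/"
  done
}

# Example, matching Alex's paths (requires root):
#   link_snappy /usr/lib64/libsnappy.so \
#     /var/lib/hadoop/lib/native/Linux-amd64-64 \
#     /var/lib/hbase/lib/native/Linux-amd64-64
```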
>
> On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com> wrote:
>
> > Hi,
> >
> > I just tried three more steps but was not able to get through.
> >
> >
> > 1) copied the snappy files to $HBASE_HOME/lib
> > $ cd $HBASE_HOME
> > $ ll lib/*sna*
> > -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54
> > lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> > -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43
> lib/snappy-java-1.0.4.1.jar
> >
> > ll lib/native/
> > drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
> >
> > ll lib/native/Linux-amd64-64/
> > total 18964
> > lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so ->
> > libhadoopsnappy.so.0.0.1
> > lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 ->
> > libhadoopsnappy.so.0.0.1
> > -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> > lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so ->
> > /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> > lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so ->
> > libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 ->
> > libprotobuf-lite.so.8.0.0
> > -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08
> libprotobuf-lite.so.8.0.0
> > lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so ->
> > libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
> > libprotobuf.so.8.0.0
> > -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> > lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so ->
> > libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
> > libprotoc.so.8.0.0
> > -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> > lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so ->
> > libsnappy.so.1.2.0
> > lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 ->
> > libsnappy.so.1.2.0
> > -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> > drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
> >
> > 2) in $HBASE_HOME/conf/hbase-env.sh, added:
> >
> > ###
> > export
> >
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> > export
> >
> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> > export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> > export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> > ###
> >
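> >
[Editor's note: for comparison, a minimal hbase-env.sh fragment is usually
enough once the native libraries are in place. Directories of .so files belong
on the library path, not on the Java CLASSPATH, which may be one reason the
longer variant above has no effect. The paths here are assumptions based on
Arthur's layout.]

```shell
# Minimal hbase-env.sh additions: expose the directories that hold
# libhadoop.so and libsnappy.so to the JVM and the dynamic linker.
# Shared objects are never loaded from the Java CLASSPATH.
export HBASE_LIBRARY_PATH="$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64:/usr/local/lib"
export LD_LIBRARY_PATH="$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native:/usr/local/lib"
```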
> > 3) restarted HBase and tried again:
> > $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> > file:///tmp/snappy-test snappy
> > 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation:
> > hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > explanation.
> > 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using
> > org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can use
> > org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.RuntimeException: native snappy
> > library not available: this version of libhadoop was built without snappy
> > support.
> >         at
> >
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >         at
> >
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >         at
> > org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >         at
> > org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >         at
> >
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >         at
> >
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >         at
> >
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >         at
> >
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >         at
> >
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >         at
> >
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >         at
> >
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >         at
> >
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >         at
> >
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >
> >
> > Regards
> > Arthur
> >
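> >
[Editor's note: the RuntimeException above is thrown by Hadoop's SnappyCodec,
not by HBase: the libhadoop.so being loaded was compiled without snappy
support. One way to confirm that directly is sketched below; it assumes
binutils' nm is available, and that the SnappyCompressor JNI symbols are what
the codec binds to.]

```shell
# Report whether a given libhadoop.so exports the JNI symbols that
# Hadoop's SnappyCodec requires. Prints "yes" or "no"; a missing or
# unreadable file counts as "no".
has_snappy_support() {
  if nm -D "$1" 2>/dev/null | grep -q SnappyCompressor; then
    echo yes
  else
    echo no
  fi
}

has_snappy_support "$HADOOP_HOME/lib/native/libhadoop.so"
```

If this prints no, rebuilding Hadoop's native libraries with snappy enabled
(or dropping in a snappy-enabled libhadoop.so) is required before any
HBase-side configuration matters; where available, `hadoop checknative -a`
gives the same answer.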
> >
> >
> > On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <
> > arthur.hk.chan@gmail.com> wrote:
> >
> > > Hi Sean,
> > >
> > > Thanks for your reply.
> > >
> > > I tried the following tests
> > >
> > > $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> > file:///tmp/snappy-test gz
> > > 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation:
> > hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > > SLF4J: Class path contains multiple SLF4J bindings.
> > > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > explanation.
> > > 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using
> > org.apache.hadoop.util.PureJavaCrc32
> > > 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can
> use
> > org.apache.hadoop.util.PureJavaCrc32C
> > > 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully
> > loaded & initialized native-zlib library
> > > 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new
> > compressor [.gz]
> > > 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new
> > compressor [.gz]
> > > 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new
> > decompressor [.gz]
> > > SUCCESS
> > >
> > >
> > > $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> > file:///tmp/snappy-test snappy
> > > 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation:
> > hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > > SLF4J: Class path contains multiple SLF4J bindings.
> > > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > explanation.
> > > 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using
> > org.apache.hadoop.util.PureJavaCrc32
> > > 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can
> use
> > org.apache.hadoop.util.PureJavaCrc32C
> > > Exception in thread "main" java.lang.RuntimeException: native snappy
> > library not available: this version of libhadoop was built without snappy
> > support.
> > >       at
> >
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> > >       at
> >
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> > >       at
> > org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> > >       at
> > org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> > >       at
> >
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> > >       at
> >
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> > >       at
> >
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> > >       at
> >
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> > >       at
> >
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> > >       at
> >
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> > >       at
> >
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> > >       at
> >
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> > >       at
> >
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> > >
> > >
> > > $ hbase shell
> > > 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation:
> > hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > > HBase Shell; enter 'help<RETURN>' for list of supported commands.
> > > Type "exit<RETURN>" to leave the HBase Shell
> > > Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
> > >
> > > hbase(main):001:0>
> > > hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION =>
> > 'snappy'}
> > > SLF4J: Class path contains multiple SLF4J bindings.
> > > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > SLF4J: Found binding in
> >
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> > explanation.
> > >
> > > ERROR: java.io.IOException: Compression algorithm 'snappy' previously
> > failed test.
> > >       at
> >
> org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> > >       at
> >
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
> > >       at
> >
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
> > >       at
> > org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
> > >       at
> > org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
> > >       at
> >
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> > >       at
> org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
> > >       at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> > >       at
> >
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> > >       at
> > java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> > >       at
> > java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> > >       at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> > >       at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> > >       at
> >
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> > >       at java.lang.Thread.run(Thread.java:662)
> > >
> > >
> > >
> > >
> > > Regards
> > > Arthur
> > >
> > >
> > > On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com> wrote:
> > >
> > >> Hi Arthur!
> > >>
> > >> Our Snappy build instructions are currently out of date and I'm
> working
> > on updating them[1]. In short, I don't think there are any special build
> > steps for using snappy.
> > >>
> > >> I'm still working out what needs to be included in our instructions
> for
> > local and cluster testing.
> > >>
> > >> If you use the test for compression options, locally things will fail
> > because the native hadoop libs won't be present:
> > >>
> > >> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> > file:///tmp/snappy-test snappy
> > >> (for comparison, replace "snappy" with "gz" and you will get a warning
> > about not having native libraries, but the test will succeed.)
> > >>
> > >> I believe JM's suggestion is for you to copy the Hadoop native
> > libraries into the local HBase lib/native directory, which would allow
> the
> > local test to pass. If you are running in a deployed Hadoop cluster, I
> > would expect the necessary libraries to already be available to HBase.
> > >>
> > >> [1]: https://issues.apache.org/jira/browse/HBASE-6189
> > >>
> > >> -Sean
> > >>
> > >>
> > >> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
> > arthur.hk.chan@gmail.com> wrote:
> > >> Hi JM
> > >>
> > >> Below are my commands; I tried two cases under the same source code folder:
> > >> a) compile with snappy parameters(failed),
> > >> b) compile without snappy parameters (successful).
> > >>
> > >> Regards
> > >> Arthur
> > >>
> > >> wget
> > http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
> > >> tar -vxf hbase-0.98.4-src.tar.gz
> > >> mv hbase-0.98.4 hbase-0.98.4-src_snappy
> > >> cd  hbase-0.98.4-src_snappy
> > >> nano dev-support/generate-hadoopX-poms.sh
> > >>   (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
> > >>
> > >>
> > >> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> > >> a) with snappy parameters
> > >> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> > -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [INFO] Building HBase - Server 0.98.4-hadoop2
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [WARNING] The POM for
> > org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no
> > dependency information available
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [INFO] Reactor Summary:
> > >> [INFO]
> > >> [INFO] HBase ............................................. SUCCESS
> > [8.192s]
> > >> [INFO] HBase - Common .................................... SUCCESS
> > [5.638s]
> > >> [INFO] HBase - Protocol .................................. SUCCESS
> > [1.535s]
> > >> [INFO] HBase - Client .................................... SUCCESS
> > [1.206s]
> > >> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> > [0.193s]
> > >> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> > [0.798s]
> > >> [INFO] HBase - Prefix Tree ............................... SUCCESS
> > [0.438s]
> > >> [INFO] HBase - Server .................................... FAILURE
> > [0.234s]
> > >> [INFO] HBase - Testing Util .............................. SKIPPED
> > >> [INFO] HBase - Thrift .................................... SKIPPED
> > >> [INFO] HBase - Shell ..................................... SKIPPED
> > >> [INFO] HBase - Integration Tests ......................... SKIPPED
> > >> [INFO] HBase - Examples .................................. SKIPPED
> > >> [INFO] HBase - Assembly .................................. SKIPPED
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [INFO] BUILD FAILURE
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [INFO] Total time: 19.474s
> > >> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
> > >> [INFO] Final Memory: 51M/1100M
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [ERROR] Failed to execute goal on project hbase-server: Could not
> > resolve dependencies for project
> > org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find
> > org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> > http://maven.oschina.net/content/groups/public/ was cached in the local
> > repository, resolution will not be reattempted until the update interval
> of
> > nexus-osc has elapsed or updates are forced -> [Help 1]
> > >> [ERROR]
> > >> [ERROR] To see the full stack trace of the errors, re-run Maven with
> > the -e switch.
> > >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> > >> [ERROR]
> > >> [ERROR] For more information about the errors and possible solutions,
> > please read the following articles:
> > >> [ERROR] [Help 1]
> >
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> > >> [ERROR]
> > >> [ERROR] After correcting the problems, you can resume the build with
> > the command
> > >> [ERROR]   mvn <goals> -rf :hbase-server
> > >>
> > >>
> > >>
> > >>
> > >> b) try again, without snappy parameters
> > >> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
> > >> [INFO] Building tar:
> >
> /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [INFO] Reactor Summary:
> > >> [INFO]
> > >> [INFO] HBase ............................................. SUCCESS
> > [3.290s]
> > >> [INFO] HBase - Common .................................... SUCCESS
> > [3.119s]
> > >> [INFO] HBase - Protocol .................................. SUCCESS
> > [0.972s]
> > >> [INFO] HBase - Client .................................... SUCCESS
> > [0.920s]
> > >> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> > [0.167s]
> > >> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> > [0.504s]
> > >> [INFO] HBase - Prefix Tree ............................... SUCCESS
> > [0.382s]
> > >> [INFO] HBase - Server .................................... SUCCESS
> > [4.790s]
> > >> [INFO] HBase - Testing Util .............................. SUCCESS
> > [0.598s]
> > >> [INFO] HBase - Thrift .................................... SUCCESS
> > [1.536s]
> > >> [INFO] HBase - Shell ..................................... SUCCESS
> > [0.369s]
> > >> [INFO] HBase - Integration Tests ......................... SUCCESS
> > [0.443s]
> > >> [INFO] HBase - Examples .................................. SUCCESS
> > [0.459s]
> > >> [INFO] HBase - Assembly .................................. SUCCESS
> > [13.240s]
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [INFO] BUILD SUCCESS
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >> [INFO] Total time: 31.408s
> > >> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
> > >> [INFO] Final Memory: 57M/1627M
> > >> [INFO]
> > ------------------------------------------------------------------------
> > >>
> > >>
> > >>
> > >>
> > >>
> > >> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
> > jean-marc@spaggiari.org> wrote:
> > >>
> > >> > Hi Arthur,
> > >> >
> > >> > How have you extracted the HBase source and what command do you run to
> > build? I
> > >> > will do the same here locally so I can provide you the exact steps to
> > >> > complete.
> > >> >
> > >> > JM
> > >> >
> > >> >
> > >> > 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
> > arthur.hk.chan@gmail.com
> > >> >> :
> > >> >
> > >> >> Hi JM
> > >> >>
> > >> >> I'm not sure what you mean. Do you mean I should create a new
> folder
> > in my
> > >> >> HBASE_SRC named lib/native/Linux-x86 and copy these files to this
> > folder
> > >> >> then try to compile it again?
> > >> >>
> > >> >> Regards
> > >> >> Arthur
> > >> >>
> > >> >>
> > >> >> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
> > jean-marc@spaggiari.org>
> > >> >> wrote:
> > >> >>
> > >> >>> Hi Arthur,
> > >> >>>
> > >> >>> Almost done! You now need to copy them into the HBase folder.
> > >> >>>
> > >> >>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar
> |
> > grep
> > >> >> -v
> > >> >>> .rb
> > >> >>> .
> > >> >>> ├── native
> > >> >>> │   └── Linux-x86
> > >> >>> │       ├── libsnappy.a
> > >> >>> │       ├── libsnappy.la
> > >> >>> │       ├── libsnappy.so
> > >> >>> │       ├── libsnappy.so.1
> > >> >>> │       └── libsnappy.so.1.2.0
> > >> >>>
> > >> >>> I don't have any hadoop-snappy lib in my hbase folder and it works
> > very
> > >> >>> well with Snappy for me...
> > >> >>>
> > >> >>> JM
> > >> >>>
> > >> >>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
> > >> >> arthur.hk.chan@gmail.com
> > >> >>>> :
> > >> >>>
> > >> >>>> Hi JM,
> > >> >>>>
> > >> >>>> Below are my steps to install the snappy lib; am I missing something?
> > >> >>>>
> > >> >>>> Regards
> > >> >>>> Arthur
> > >> >>>>
> > >> >>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> > >> >>>> tar -vxf snappy-1.1.1.tar.gz
> > >> >>>> cd snappy-1.1.1
> > >> >>>> ./configure
> > >> >>>> make
> > >> >>>> make install
> > >> >>>>       make[1]: Entering directory
> > >> >> `/edh/hadoop_all_sources/snappy-1.1.1'
> > >> >>>>       test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> > >> >>>>        /bin/sh ./libtool   --mode=install /usr/bin/install -c
> > >> >>>> libsnappy.la '/usr/local/lib'
> > >> >>>>       libtool: install: /usr/bin/install -c
> > .libs/libsnappy.so.1.2.0
> > >> >>>> /usr/local/lib/libsnappy.so.1.2.0
> > >> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> > >> >>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln
> > -s
> > >> >>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> > >> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> > >> >>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
> > >> >>>> libsnappy.so.1.2.0 libsnappy.so; }; })
> > >> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> > >> >>>> /usr/local/lib/libsnappy.la
> > >> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.a
> > >> >>>> /usr/local/lib/libsnappy.a
> > >> >>>>       libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> > >> >>>>       libtool: install: ranlib /usr/local/lib/libsnappy.a
> > >> >>>>       libtool: finish:
> > >> >>>>
> > >> >>
> >
> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> > >> >>>> ldconfig -n /usr/local/lib
> > >> >>>>
> > >> >>>>
> > ----------------------------------------------------------------------
> > >> >>>>       Libraries have been installed in:
> > >> >>>>       /usr/local/lib
> > >> >>>>       If you ever happen to want to link against installed
> > libraries
> > >> >>>>       in a given directory, LIBDIR, you must either use libtool,
> > and
> > >> >>>>       specify the full pathname of the library, or use the
> > `-LLIBDIR'
> > >> >>>>       flag during linking and do at least one of the following:
> > >> >>>>       - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> > >> >>>>       during execution
> > >> >>>>       - add LIBDIR to the `LD_RUN_PATH' environment variable
> > >> >>>>       during linking
> > >> >>>>       - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> > >> >>>>       - have your system administrator add LIBDIR to
> > `/etc/ld.so.conf'
> > >> >>>>       See any operating system documentation about shared
> > libraries for
> > >> >>>>       more information, such as the ld(1) and ld.so(8) manual
> > pages.
> > >> >>>>
> > >> >>>>
> > ----------------------------------------------------------------------
> > >> >>>>       test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> > >> >>>> "/usr/local/share/doc/snappy"
> > >> >>>>        /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
> > README
> > >> >>>> format_description.txt framing_format.txt
> > '/usr/local/share/doc/snappy'
> > >> >>>>       test -z "/usr/local/include" || /bin/mkdir -p
> > >> >> "/usr/local/include"
> > >> >>>>        /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> > >> >>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> > >> >>>>       make[1]: Leaving directory
> > `/edh/hadoop_all_sources/snappy-1.1.1'
> > >> >>>>
> > >> >>>> ll /usr/local/lib
> > >> >>>>       -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> > >> >>>>       -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> > >> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so
> ->
> > >> >>>> libsnappy.so.1.2.0
> > >> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14
> libsnappy.so.1
> > ->
> > >> >>>> libsnappy.so.1.2.0
> > >> >>>>       -rwxr-xr-x. 1 root root   147726 Aug 20 00:14
> > libsnappy.so.1.2.0
> > >> >>>>
> > >> >>>>
> > >> >>>>
> > >> >>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
> > >> >> jean-marc@spaggiari.org>
> > >> >>>> wrote:
> > >> >>>>
> > >> >>>>> Hi Arthur,
> > >> >>>>>
> > >> >>>>> Do you have snappy libs installed and configured? HBase doesn't
> > come
> > >> >> with
> > >> >>>>> Snappy. So you need to have it first.
> > >> >>>>>
> > >> >>>>> Shameless plug:
> > >> >>>>>
> > >> >>>>
> > >> >>
> >
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> > >> >>>>>
> > >> >>>>> This is for 0.96 but should be very similar for 0.98. I will try
> > it
> > >> >> soon
> > >> >>>>> and post an update, but keep us posted here so we can support
> > you...
> > >> >>>>>
> > >> >>>>> JM
> > >> >>>>>
> > >> >>>>>
> > >> >>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> > >> >>>> arthur.hk.chan@gmail.com
> > >> >>>>>> :
> > >> >>>>>
> > >> >>>>>> Hi,
> > >> >>>>>>
> > >> >>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version
> is
> > >> >> 2.4.1)
> > >> >>>>>>
> > >> >>>>>> Can you please advise what might be wrong?  Could my pom.xml
> be
> > >> >>>> incorrect
> > >> >>>>>> or missing something?
> > >> >>>>>>
> > >> >>>>>> Regards
> > >> >>>>>> Arthur
> > >> >>>>>>
> > >> >>>>>>
> > >> >>>>>> Below are my commands:
> > >> >>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4
> > 0.98.4-hadoop2
> > >> >>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> > >> >>>>>> -Prelease,hadoop-snappy
> > >> >>>>>>
> > >> >>>>>> Log:
> > >> >>>>>> [INFO]
> > >> >>>>>>
> > >> >>
> > ------------------------------------------------------------------------
> > >> >>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> > >> >>>>>> [INFO]
> > >> >>>>>>
> > >> >>
> > ------------------------------------------------------------------------
> > >> >>>>>> [WARNING] The POM for
> > >> >> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> > >> >>>>>> is missing, no dependency information available
> > >> >>>>>> [INFO]
> > >> >>>>>>
> > >> >>
> > ------------------------------------------------------------------------
> > >> >>>>>> [INFO] Reactor Summary:
> > >> >>>>>> [INFO]
> > >> >>>>>> [INFO] HBase .............................................
> > SUCCESS
> > >> >>>> [3.129s]
> > >> >>>>>> [INFO] HBase - Common ....................................
> > SUCCESS
> > >> >>>> [3.105s]
> > >> >>>>>> [INFO] HBase - Protocol ..................................
> > SUCCESS
> > >> >>>> [0.976s]
> > >> >>>>>> [INFO] HBase - Client ....................................
> > SUCCESS
> > >> >>>> [0.925s]
> > >> >>>>>> [INFO] HBase - Hadoop Compatibility ......................
> > SUCCESS
> > >> >>>> [0.183s]
> > >> >>>>>> [INFO] HBase - Hadoop Two Compatibility ..................
> > SUCCESS
> > >> >>>> [0.497s]
> > >> >>>>>> [INFO] HBase - Prefix Tree ...............................
> > SUCCESS
> > >> >>>> [0.407s]
> > >> >>>>>> [INFO] HBase - Server ....................................
> > FAILURE
> > >> >>>> [0.103s]
> > >> >>>>>> [INFO] HBase - Testing Util ..............................
> > SKIPPED
> > >> >>>>>> [INFO] HBase - Thrift ....................................
> > SKIPPED
> > >> >>>>>> [INFO] HBase - Shell .....................................
> > SKIPPED
> > >> >>>>>> [INFO] HBase - Integration Tests .........................
> > SKIPPED
> > >> >>>>>> [INFO] HBase - Examples ..................................
> > SKIPPED
> > >> >>>>>> [INFO] HBase - Assembly ..................................
> > SKIPPED
> > >> >>>>>> [INFO]
> > >> >>>>>>
> > >> >>
> > ------------------------------------------------------------------------
> > >> >>>>>> [INFO] BUILD FAILURE
> > >> >>>>>> [INFO]
> > >> >>>>>>
> > >> >>
> > ------------------------------------------------------------------------
> > >> >>>>>> [INFO] Total time: 9.939s
> > >> >>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> > >> >>>>>> [INFO] Final Memory: 61M/2921M
> > >> >>>>>> [INFO]
> > >> >>>>>>
> > >> >>
> > ------------------------------------------------------------------------
> > >> >>>>>> [ERROR] Failed to execute goal on project hbase-server: Could
> not
> > >> >>>> resolve
> > >> >>>>>> dependencies for project
> > >> >>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> > >> >>>>>> Failure to find
> > org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> > >> >>>>>> http://maven.oschina.net/content/groups/public/ was cached in
> > the
> > >> >> local
> > >> >>>>>> repository, resolution will not be reattempted until the update
> > >> >>>> interval of
> > >> >>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
> > >> >>>>>> [ERROR]
> > >> >>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven
> > with
> > >> >> the
> > >> >>>>>> -e switch.
> > >> >>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
> > logging.
> > >> >>>>>> [ERROR]
> > >> >>>>>> [ERROR] For more information about the errors and possible
> > solutions,
> > >> >>>>>> please read the following articles:
> > >> >>>>>> [ERROR] [Help 1]
> > >> >>>>>>
> > >> >>>>
> > >> >>
> >
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> > >> >>>>>> [ERROR]
> > >> >>>>>> [ERROR] After correcting the problems, you can resume the build
> > with
> > >> >> the
> > >> >>>>>> command
> > >> >>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> > >> >>>>>>
> > >> >>>>>>
> > >> >>>>
> > >> >>>>
> > >> >>
> > >> >>
> > >>
> > >>
> > >>
> > >>
> > >> --
> > >> Sean
> > >
> >
> >
>

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Alex Kamil <al...@gmail.com>.
Something like this worked for me:
1. get hbase binaries
2. sudo yum install snappy snappy-devel
3. ln -sf /usr/lib64/libsnappy.so
/var/lib/hadoop/lib/native/Linux-amd64-64/.
4. ln -sf /usr/lib64/libsnappy.so /var/lib/hbase/lib/native/Linux-amd64-64/.
5. add snappy jar under $HADOOP_HOME/lib and $HBASE_HOME/lib
ref: https://issues.apache.org/jira/browse/PHOENIX-877
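
Steps 3 and 4 above boil down to symlinking the system libsnappy into the
native-library directory that Hadoop and HBase search. Below is a minimal
sketch of that symlink step, run against a scratch directory so it is safe
to execute without root; the real targets (/usr/lib64/libsnappy.so,
/var/lib/hbase/lib/native/Linux-amd64-64) are the paths from the message
above and may differ on your system.

```shell
#!/bin/sh
# Sketch only: a temp directory stands in for the real
# /var/lib/hbase/lib/native/Linux-amd64-64 so this runs anywhere.
set -e
SCRATCH=$(mktemp -d)
NATIVE_DIR="$SCRATCH/lib/native/Linux-amd64-64"
mkdir -p "$NATIVE_DIR"

# Stand-in for the versioned library that `yum install snappy`
# drops into /usr/lib64.
SYSLIB="$SCRATCH/libsnappy.so.1.2.0"
touch "$SYSLIB"

# The symlink from steps 3-4: point the native dir at the system lib.
ln -sf "$SYSLIB" "$NATIVE_DIR/libsnappy.so"

# Verify the link resolves before restarting HBase.
readlink "$NATIVE_DIR/libsnappy.so"
```

On a real cluster you would run the `ln -sf` against the actual HBase and
Hadoop native directories, then re-run the CompressionTest shown later in
this thread to confirm the codec loads.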


On Tue, Aug 26, 2014 at 7:25 PM, Arthur.hk.chan@gmail.com <
arthur.hk.chan@gmail.com> wrote:

> Hi,
>
> I just tried three more steps but was not able to get through.
>
>
> 1) copied  snappy files to $HBASE_HOME/lib
> $ cd $HBASE_HOME
> $ ll lib/*sna*
> -rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54
> lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> -rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43 lib/snappy-java-1.0.4.1.jar
>
> ll lib/native/
> drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64
>
> ll lib/native/Linux-amd64-64/
> total 18964
> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so ->
> libhadoopsnappy.so.0.0.1
> lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 ->
> libhadoopsnappy.so.0.0.1
> -rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
> lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so ->
> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so ->
> libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 ->
> libprotobuf-lite.so.8.0.0
> -rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so ->
> libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 ->
> libprotobuf.so.8.0.0
> -rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so ->
> libprotoc.so.8.0.0
> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 ->
> libprotoc.so.8.0.0
> -rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so ->
> libsnappy.so.1.2.0
> lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 ->
> libsnappy.so.1.2.0
> -rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
> drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
>
> 2)  $HBASE_HOME/conf/hbase-env.sh, added
>
> ###
> export
> LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
> export
> HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
> export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
> export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
> ###
>
> 3) restart HBASE and tried again
> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test snappy
> 2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> 2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> 2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> Exception in thread "main" java.lang.RuntimeException: native snappy
> library not available: this version of libhadoop was built without snappy
> support.
>         at
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
>         at
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
>         at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
>         at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
>         at
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
>         at
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
>         at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
>         at
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
>         at
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
>         at
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
>
>
> Regards
> Arthur
>
>
>
> On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com> wrote:
>
> > Hi Sean,
> >
> > Thanks for your reply.
> >
> > I tried the following tests
> >
> > $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test gz
> > 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> > 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> > 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully
> loaded & initialized native-zlib library
> > 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.gz]
> > 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new
> compressor [.gz]
> > 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new
> decompressor [.gz]
> > SUCCESS
> >
> >
> > $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test snappy
> > 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> > 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using
> org.apache.hadoop.util.PureJavaCrc32
> > 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can use
> org.apache.hadoop.util.PureJavaCrc32C
> > Exception in thread "main" java.lang.RuntimeException: native snappy
> library not available: this version of libhadoop was built without snappy
> support.
> >       at
> org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> >       at
> org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> >       at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> >       at
> org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> >       at
> org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> >       at
> org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> >       at
> org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> >       at
> org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> >       at
> org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> >
> >
> > $ hbase shell
> > 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation:
> hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> > HBase Shell; enter 'help<RETURN>' for list of supported commands.
> > Type "exit<RETURN>" to leave the HBase Shell
> > Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
> >
> > hbase(main):001:0>
> > hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION =>
> 'snappy'}
> > SLF4J: Class path contains multiple SLF4J bindings.
> > SLF4J: Found binding in
> [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: Found binding in
> [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> > SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
> explanation.
> >
> > ERROR: java.io.IOException: Compression algorithm 'snappy' previously
> failed test.
> >       at
> org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> >       at
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
> >       at
> org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
> >       at
> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
> >       at
> org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
> >       at
> org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> >       at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
> >       at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> >       at
> org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> >       at
> java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> >       at
> java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> >       at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> >       at
> java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> >       at
> java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> >       at java.lang.Thread.run(Thread.java:662)
> >
> >
> >
> >
> > Regards
> > Arthur
> >
> >
> > On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com> wrote:
> >
> >> Hi Arthur!
> >>
> >> Our Snappy build instructions are currently out of date and I'm working
> on updating them[1]. In short, I don't think there are any special build
> steps for using snappy.
> >>
> >> I'm still working out what needs to be included in our instructions for
> local and cluster testing.
> >>
> >> If you use the test for compression options, locally things will fail
> because the native hadoop libs won't be present:
> >>
> >> bin/hbase org.apache.hadoop.hbase.util.CompressionTest
> file:///tmp/snappy-test snappy
> >> (for comparison, replace "snappy" with "gz" and you will get a warning
> about not having native libraries, but the test will succeed.)
> >>
> >> I believe JM's suggestion is for you to copy the Hadoop native
> libraries into the local HBase lib/native directory, which would allow the
> local test to pass. If you are running in a deployed Hadoop cluster, I
> would expect the necessary libraries to already be available to HBase.
> >>
> >> [1]: https://issues.apache.org/jira/browse/HBASE-6189
> >>
> >> -Sean
> >>
> >>
> >> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com> wrote:
> >> Hi JM
> >>
> >> Below are my commands, tried two cases under same source code folder:
> >> a) compile with snappy parameters (failed),
> >> b) compile without snappy parameters (successful).
> >>
> >> Regards
> >> Arthur
> >>
> >> wget
> http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
> >> tar -vxf hbase-0.98.4-src.tar.gz
> >> mv hbase-0.98.4 hbase-0.98.4-src_snappy
> >> cd  hbase-0.98.4-src_snappy
> >> nano dev-support/generate-hadoopX-poms.sh
> >>   (change  hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
> >>
> >>
> >> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >> a) with snappy parameters
> >> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
> >> [INFO]
> ------------------------------------------------------------------------
> >> [INFO] Building HBase - Server 0.98.4-hadoop2
> >> [INFO]
> ------------------------------------------------------------------------
> >> [WARNING] The POM for
> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no
> dependency information available
> >> [INFO]
> ------------------------------------------------------------------------
> >> [INFO] Reactor Summary:
> >> [INFO]
> >> [INFO] HBase ............................................. SUCCESS
> [8.192s]
> >> [INFO] HBase - Common .................................... SUCCESS
> [5.638s]
> >> [INFO] HBase - Protocol .................................. SUCCESS
> [1.535s]
> >> [INFO] HBase - Client .................................... SUCCESS
> [1.206s]
> >> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> [0.193s]
> >> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> [0.798s]
> >> [INFO] HBase - Prefix Tree ............................... SUCCESS
> [0.438s]
> >> [INFO] HBase - Server .................................... FAILURE
> [0.234s]
> >> [INFO] HBase - Testing Util .............................. SKIPPED
> >> [INFO] HBase - Thrift .................................... SKIPPED
> >> [INFO] HBase - Shell ..................................... SKIPPED
> >> [INFO] HBase - Integration Tests ......................... SKIPPED
> >> [INFO] HBase - Examples .................................. SKIPPED
> >> [INFO] HBase - Assembly .................................. SKIPPED
> >> [INFO]
> ------------------------------------------------------------------------
> >> [INFO] BUILD FAILURE
> >> [INFO]
> ------------------------------------------------------------------------
> >> [INFO] Total time: 19.474s
> >> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
> >> [INFO] Final Memory: 51M/1100M
> >> [INFO]
> ------------------------------------------------------------------------
> >> [ERROR] Failed to execute goal on project hbase-server: Could not
> resolve dependencies for project
> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find
> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> http://maven.oschina.net/content/groups/public/ was cached in the local
> repository, resolution will not be reattempted until the update interval of
> nexus-osc has elapsed or updates are forced -> [Help 1]
> >> [ERROR]
> >> [ERROR] To see the full stack trace of the errors, re-run Maven with
> the -e switch.
> >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >> [ERROR]
> >> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> >> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >> [ERROR]
> >> [ERROR] After correcting the problems, you can resume the build with
> the command
> >> [ERROR]   mvn <goals> -rf :hbase-server
> >>
> >>
> >>
> >>
> >> b) try again, without snappy parameters
> >> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
> >> [INFO] Building tar:
> /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
> >> [INFO]
> ------------------------------------------------------------------------
> >> [INFO] Reactor Summary:
> >> [INFO]
> >> [INFO] HBase ............................................. SUCCESS
> [3.290s]
> >> [INFO] HBase - Common .................................... SUCCESS
> [3.119s]
> >> [INFO] HBase - Protocol .................................. SUCCESS
> [0.972s]
> >> [INFO] HBase - Client .................................... SUCCESS
> [0.920s]
> >> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> [0.167s]
> >> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> [0.504s]
> >> [INFO] HBase - Prefix Tree ............................... SUCCESS
> [0.382s]
> >> [INFO] HBase - Server .................................... SUCCESS
> [4.790s]
> >> [INFO] HBase - Testing Util .............................. SUCCESS
> [0.598s]
> >> [INFO] HBase - Thrift .................................... SUCCESS
> [1.536s]
> >> [INFO] HBase - Shell ..................................... SUCCESS
> [0.369s]
> >> [INFO] HBase - Integration Tests ......................... SUCCESS
> [0.443s]
> >> [INFO] HBase - Examples .................................. SUCCESS
> [0.459s]
> >> [INFO] HBase - Assembly .................................. SUCCESS
> [13.240s]
> >> [INFO]
> ------------------------------------------------------------------------
> >> [INFO] BUILD SUCCESS
> >> [INFO]
> ------------------------------------------------------------------------
> >> [INFO] Total time: 31.408s
> >> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
> >> [INFO] Final Memory: 57M/1627M
> >> [INFO]
> ------------------------------------------------------------------------
> >>
> >>
> >>
> >>
> >>
> >> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org> wrote:
> >>
> >> > Hi Arthur,
> >> >
> >> > How have you extracted HBase source and what command do you run to
> build? I
> >> > will do the same here locally so I can provide you the exact step to
> >> > complete.
> >> >
> >> > JM
> >> >
> >> >
> >> > 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com
> >> >> :
> >> >
> >> >> Hi JM
> >> >>
> >> >> Not too sure what you mean. Do you mean I should create a new folder
> in my
> >> >> HBASE_SRC named lib/native/Linux-x86 and copy these files to this
> folder
> >> >> then try to compile it again?
> >> >>
> >> >> Regards
> >> >> Arthur
> >> >>
> >> >>
> >> >> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org>
> >> >> wrote:
> >> >>
> >> >>> Hi Arthur,
> >> >>>
> >> >>> Almost done! You now need to copy them on the HBase folder.
> >> >>>
> >> >>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar |
> grep
> >> >> -v
> >> >>> .rb
> >> >>> .
> >> >>> ├── native
> >> >>> │   └── Linux-x86
> >> >>> │       ├── libsnappy.a
> >> >>> │       ├── libsnappy.la
> >> >>> │       ├── libsnappy.so
> >> >>> │       ├── libsnappy.so.1
> >> >>> │       └── libsnappy.so.1.2.0
> >> >>>
> >> >>> I don't have any hadoop-snappy lib in my hbase folder and it works
> very
> >> >>> well with Snappy for me...
> >> >>>
> >> >>> JM
> >> >>>
> >> >>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
> >> >> arthur.hk.chan@gmail.com
> >> >>>> :
> >> >>>
> >> >>>> Hi JM,
> >> >>>>
> >> >>>> Below are my steps to install snappy lib, do I miss something?
> >> >>>>
> >> >>>> Regards
> >> >>>> Arthur
> >> >>>>
> >> >>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> >> >>>> tar -vxf snappy-1.1.1.tar.gz
> >> >>>> cd snappy-1.1.1
> >> >>>> ./configure
> >> >>>> make
> >> >>>> make install
> >> >>>>       make[1]: Entering directory
> >> >> `/edh/hadoop_all_sources/snappy-1.1.1'
> >> >>>>       test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> >> >>>>        /bin/sh ./libtool   --mode=install /usr/bin/install -c
> >> >>>> libsnappy.la '/usr/local/lib'
> >> >>>>       libtool: install: /usr/bin/install -c
> .libs/libsnappy.so.1.2.0
> >> >>>> /usr/local/lib/libsnappy.so.1.2.0
> >> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> >> >>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln
> -s
> >> >>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> >> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> >> >>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
> >> >>>> libsnappy.so.1.2.0 libsnappy.so; }; })
> >> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> >> >>>> /usr/local/lib/libsnappy.la
> >> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.a
> >> >>>> /usr/local/lib/libsnappy.a
> >> >>>>       libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> >> >>>>       libtool: install: ranlib /usr/local/lib/libsnappy.a
> >> >>>>       libtool: finish:
> >> >>>>
> >> >>
> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> >> >>>> ldconfig -n /usr/local/lib
> >> >>>>
> >> >>>>
> ----------------------------------------------------------------------
> >> >>>>       Libraries have been installed in:
> >> >>>>       /usr/local/lib
> >> >>>>       If you ever happen to want to link against installed
> libraries
> >> >>>>       in a given directory, LIBDIR, you must either use libtool,
> and
> >> >>>>       specify the full pathname of the library, or use the
> `-LLIBDIR'
> >> >>>>       flag during linking and do at least one of the following:
> >> >>>>       - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> >> >>>>       during execution
> >> >>>>       - add LIBDIR to the `LD_RUN_PATH' environment variable
> >> >>>>       during linking
> >> >>>>       - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> >> >>>>       - have your system administrator add LIBDIR to
> `/etc/ld.so.conf'
> >> >>>>       See any operating system documentation about shared
> libraries for
> >> >>>>       more information, such as the ld(1) and ld.so(8) manual
> pages.
> >> >>>>
> >> >>>>
> ----------------------------------------------------------------------
> >> >>>>       test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> >> >>>> "/usr/local/share/doc/snappy"
> >> >>>>        /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
> README
> >> >>>> format_description.txt framing_format.txt
> '/usr/local/share/doc/snappy'
> >> >>>>       test -z "/usr/local/include" || /bin/mkdir -p
> >> >> "/usr/local/include"
> >> >>>>        /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> >> >>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> >> >>>>       make[1]: Leaving directory
> `/edh/hadoop_all_sources/snappy-1.1.1'
> >> >>>>
> >> >>>> ll /usr/local/lib
> >> >>>>       -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> >> >>>>       -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> >> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
> >> >>>> libsnappy.so.1.2.0
> >> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1
> ->
> >> >>>> libsnappy.so.1.2.0
> >> >>>>       -rwxr-xr-x. 1 root root   147726 Aug 20 00:14
> libsnappy.so.1.2.0
> >> >>>>
> >> >>>>
> >> >>>>
> >> >>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
> >> >> jean-marc@spaggiari.org>
> >> >>>> wrote:
> >> >>>>
> >> >>>>> Hi Arthur,
> >> >>>>>
> >> >>>>> Do you have snappy libs installed and configured? HBase doesn't
> come
> >> >> with
> >> >>>>> Snappy. So you need to have it first.
> >> >>>>>
> >> >>>>> Shameless plug:
> >> >>>>>
> >> >>>>
> >> >>
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >> >>>>>
> >> >>>>> This is for 0.96 but should be very similar for 0.98. I will try
> it
> >> >> soon
> >> >>>>> and post an update, but keep us posted here so we can support
> you...
> >> >>>>>
> >> >>>>> JM
> >> >>>>>
> >> >>>>>
> >> >>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> >> >>>> arthur.hk.chan@gmail.com
> >> >>>>>> :
> >> >>>>>
> >> >>>>>> Hi,
> >> >>>>>>
> >> >>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is
> >> >> 2.4.1)
> >> >>>>>>
> >> >>>>>> Can you please advise what might be wrong?  Could my pom.xml be
> >> >>>> incorrect
> >> >>>>>> or missing something?
> >> >>>>>>
> >> >>>>>> Regards
> >> >>>>>> Arthur
> >> >>>>>>
> >> >>>>>>
> >> >>>>>> Below are my commands:
> >> >>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4
> 0.98.4-hadoop2
> >> >>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >> >>>>>> -Prelease,hadoop-snappy
> >> >>>>>>
> >> >>>>>> Log:
> >> >>>>>> [INFO]
> >> >>>>>>
> >> >>
> ------------------------------------------------------------------------
> >> >>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >> >>>>>> [INFO]
> >> >>>>>>
> >> >>
> ------------------------------------------------------------------------
> >> >>>>>> [WARNING] The POM for
> >> >> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> >> >>>>>> is missing, no dependency information available
> >> >>>>>> [INFO]
> >> >>>>>>
> >> >>
> ------------------------------------------------------------------------
> >> >>>>>> [INFO] Reactor Summary:
> >> >>>>>> [INFO]
> >> >>>>>> [INFO] HBase .............................................
> SUCCESS
> >> >>>> [3.129s]
> >> >>>>>> [INFO] HBase - Common ....................................
> SUCCESS
> >> >>>> [3.105s]
> >> >>>>>> [INFO] HBase - Protocol ..................................
> SUCCESS
> >> >>>> [0.976s]
> >> >>>>>> [INFO] HBase - Client ....................................
> SUCCESS
> >> >>>> [0.925s]
> >> >>>>>> [INFO] HBase - Hadoop Compatibility ......................
> SUCCESS
> >> >>>> [0.183s]
> >> >>>>>> [INFO] HBase - Hadoop Two Compatibility ..................
> SUCCESS
> >> >>>> [0.497s]
> >> >>>>>> [INFO] HBase - Prefix Tree ...............................
> SUCCESS
> >> >>>> [0.407s]
> >> >>>>>> [INFO] HBase - Server ....................................
> FAILURE
> >> >>>> [0.103s]
> >> >>>>>> [INFO] HBase - Testing Util ..............................
> SKIPPED
> >> >>>>>> [INFO] HBase - Thrift ....................................
> SKIPPED
> >> >>>>>> [INFO] HBase - Shell .....................................
> SKIPPED
> >> >>>>>> [INFO] HBase - Integration Tests .........................
> SKIPPED
> >> >>>>>> [INFO] HBase - Examples ..................................
> SKIPPED
> >> >>>>>> [INFO] HBase - Assembly ..................................
> SKIPPED
> >> >>>>>> [INFO]
> >> >>>>>>
> >> >>
> ------------------------------------------------------------------------
> >> >>>>>> [INFO] BUILD FAILURE
> >> >>>>>> [INFO]
> >> >>>>>>
> >> >>
> ------------------------------------------------------------------------
> >> >>>>>> [INFO] Total time: 9.939s
> >> >>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >> >>>>>> [INFO] Final Memory: 61M/2921M
> >> >>>>>> [INFO]
> >> >>>>>>
> >> >>
> ------------------------------------------------------------------------
> >> >>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
> >> >>>> resolve
> >> >>>>>> dependencies for project
> >> >>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> >> >>>>>> Failure to find
> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >> >>>>>> http://maven.oschina.net/content/groups/public/ was cached in
> the
> >> >> local
> >> >>>>>> repository, resolution will not be reattempted until the update
> >> >>>> interval of
> >> >>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
> >> >>>>>> [ERROR]
> >> >>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven
> with
> >> >> the
> >> >>>>>> -e switch.
> >> >>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
> logging.
> >> >>>>>> [ERROR]
> >> >>>>>> [ERROR] For more information about the errors and possible
> solutions,
> >> >>>>>> please read the following articles:
> >> >>>>>> [ERROR] [Help 1]
> >> >>>>>>
> >> >>>>
> >> >>
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >> >>>>>> [ERROR]
> >> >>>>>> [ERROR] After correcting the problems, you can resume the build
> with
> >> >> the
> >> >>>>>> command
> >> >>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >> >>>>>>
> >> >>>>>>
> >> >>>>
> >> >>>>
> >> >>
> >> >>
> >>
> >>
> >>
> >>
> >> --
> >> Sean
> >
>
>

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi,

I just tried three more steps but was not able to get thru.


1) copied  snappy files to $HBASE_HOME/lib
$ cd $HBASE_HOME
$ ll lib/*sna*
-rw-r--r--. 1 hduser hadoop  11526 Aug 27 06:54 lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
-rw-rw-r--. 1 hduser hadoop 995968 Aug  3 18:43 lib/snappy-java-1.0.4.1.jar

ll lib/native/
drwxrwxr-x. 4 hduser hadoop 4096 Aug 27 06:54 Linux-amd64-64

ll lib/native/Linux-amd64-64/
total 18964
lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so -> libhadoopsnappy.so.0.0.1
lrwxrwxrwx. 1 hduser Hadoop      24 Aug 27 07:08 libhadoopsnappy.so.0 -> libhadoopsnappy.so.0.0.1
-rwxr-xr-x. 1 hduser Hadoop   54961 Aug 27 07:08 libhadoopsnappy.so.0.0.1
lrwxrwxrwx. 1 hduser Hadoop      55 Aug 27 07:08 libjvm.so -> /usr/lib/jvm/jdk1.6.0_45/jre/lib/amd64/server/libjvm.so
lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so -> libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser Hadoop      25 Aug 27 07:08 libprotobuf-lite.so.8 -> libprotobuf-lite.so.8.0.0
-rwxr-xr-x. 1 hduser Hadoop  964689 Aug 27 07:08 libprotobuf-lite.so.8.0.0
lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so -> libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser Hadoop      20 Aug 27 07:08 libprotobuf.so.8 -> libprotobuf.so.8.0.0
-rwxr-xr-x. 1 hduser Hadoop 8300050 Aug 27 07:08 libprotobuf.so.8.0.0
lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so -> libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libprotoc.so.8 -> libprotoc.so.8.0.0
-rwxr-xr-x. 1 hduser Hadoop 9935810 Aug 27 07:08 libprotoc.so.8.0.0
lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so -> libsnappy.so.1.2.0
lrwxrwxrwx. 1 hduser Hadoop      18 Aug 27 07:08 libsnappy.so.1 -> libsnappy.so.1.2.0
-rwxr-xr-x. 1 hduser Hadoop  147726 Aug 27 07:08 libsnappy.so.1.2.0
drwxr-xr-x. 2 hduser Hadoop    4096 Aug 27 07:08 pkgconfig
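Since several of the entries above are symlinks, one quick sanity check is to make sure none of them dangle. A minimal sketch — the `check_links` helper and the demo directory are illustrative, not part of HBase:

```shell
# List dangling symlinks under a directory (prints nothing when all resolve).
check_links() {
  find "$1" -type l ! -exec test -e {} \; -print
}

# Demo on a throwaway directory: one good link, one dangling link.
d=$(mktemp -d)
touch "$d/libsnappy.so.1.2.0"
ln -s libsnappy.so.1.2.0 "$d/libsnappy.so"      # resolves
ln -s libmissing.so.0.0.1 "$d/libbroken.so"     # dangles, gets printed
check_links "$d"
rm -rf "$d"
```

Running it as `check_links "$HBASE_HOME/lib/native/Linux-amd64-64"` should print nothing if the listing above is healthy.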
 
2)  $HBASE_HOME/conf/hbase-env.sh, added

### 
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/
export HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH:$HBASE_HOME/lib/native/Linux-amd64-64/:/usr/local/lib/:$HBASE_HOME/lib/hadoop-snappy-0.0.1-SNAPSHOT.jar
export CLASSPATH=$CLASSPATH:$HBASE_LIBRARY_PATH
export HBASE_CLASSPATH=$HBASE_CLASSPATH:$HBASE_LIBRARY_PATH
###
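A minimal, runnable sketch of what those exports do — build the loader path from the native-library dirs so the result can be echoed and eyeballed before restarting HBase. The `$HADOOP_HOME`/`$HBASE_HOME` defaults below are assumed paths taken from this thread; adjust them to your layout:

```shell
# Assumed install locations (from this thread); override via the environment.
HADOOP_HOME=${HADOOP_HOME:-/edh/hadoop/hadoop-2.4.1}
HBASE_HOME=${HBASE_HOME:-/edh/hadoop/hbase-0.98.4-hadoop2}

# Append the native-library dirs, as the hbase-env.sh lines above do.
export LD_LIBRARY_PATH="${LD_LIBRARY_PATH:+$LD_LIBRARY_PATH:}$HADOOP_HOME/lib/native/Linux-amd64-64:/usr/local/lib"
export HBASE_LIBRARY_PATH="${HBASE_LIBRARY_PATH:+$HBASE_LIBRARY_PATH:}$HBASE_HOME/lib/native/Linux-amd64-64:/usr/local/lib"

# Eyeball the result before restarting HBase.
echo "LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
echo "HBASE_LIBRARY_PATH=$HBASE_LIBRARY_PATH"
```

One caveat worth noting: a jar on a library-path variable (as the original snippet does with hadoop-snappy-0.0.1-SNAPSHOT.jar) has no effect on the native loader; jars belong on HBASE_CLASSPATH only.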

3) restart HBASE and tried again
$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
2014-08-27 07:16:09,490 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-27 07:16:10,323 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-27 07:16:10,324 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
	at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
	at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
	at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
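The exception above comes from libhadoop.so, not from HBase: the native Hadoop library on this box was compiled without snappy, so no amount of HBase-side configuration will fix it. A hedged sketch of the usual remedy follows — the build flags are the standard Hadoop 2.x native-build options, and the target path is an assumption taken from this thread. The commands are echoed rather than executed here, since the native rebuild takes a while and must be run from the Hadoop source root:

```shell
# Echo the remedy steps; run them by hand from the Hadoop 2.4.1 source root.
cat <<'EOF'
# 1) Rebuild libhadoop with snappy support (snappy installed under /usr/local):
mvn package -Pdist,native -DskipTests -Dtar -Drequire.snappy -Dsnappy.prefix=/usr/local

# 2) Make the rebuilt native libs visible to HBase (path is an assumption):
cp hadoop-dist/target/hadoop-2.4.1/lib/native/libhadoop.* \
   "$HBASE_HOME/lib/native/Linux-amd64-64/"

# 3) Verify snappy shows up as available before retrying the CompressionTest:
hadoop checknative -a
EOF
```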


Regards
Arthur



On 27 Aug, 2014, at 6:27 am, Arthur.hk.chan@gmail.com <ar...@gmail.com> wrote:

> Hi Sean,
> 
> Thanks for your reply.
> 
> I tried the following tests
> 
> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
> 2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> 2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
> 2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
> 2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
> SUCCESS
> 
> 
> $ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
> 2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
> 2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
> Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
> 	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
> 	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
> 	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
> 	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
> 	at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
> 	at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
> 	at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
> 	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
> 	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
> 	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
> 	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
> 
> 
> $ hbase shell
> 2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
> HBase Shell; enter 'help<RETURN>' for list of supported commands.
> Type "exit<RETURN>" to leave the HBase Shell
> Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014
> 
> hbase(main):001:0> 
> hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION => 'snappy'}
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> 
> ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
> 	at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
> 	at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
> 	at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
> 	at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
> 	at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
> 	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
> 	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
> 	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
> 	at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
> 	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
> 	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
> 	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
> 	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
> 	at java.lang.Thread.run(Thread.java:662)
> 
> 
> 
> 
> Regards
> Arthur
> 
> 
> On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com> wrote:
> 
>> Hi Arthur!
>> 
>> Our Snappy build instructions are currently out of date and I'm working on updating them[1]. In short, I don't think there are any special build steps for using snappy.
>> 
>> I'm still working out what needs to be included in our instructions for local and cluster testing.
>> 
>> If you use the test for compression options, locally things will fail because the native hadoop libs won't be present:
>> 
>> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy 
>> (for comparison, replace "snappy" with "gz" and you will get a warning about not having native libraries, but the test will succeed.)
>> 
>> I believe JM's suggestion is for you to copy the Hadoop native libraries into the local HBase lib/native directory, which would allow the local test to pass. If you are running in a deployed Hadoop cluster, I would expect the necessary libraries to already be available to HBase.
>> 
>> [1]: https://issues.apache.org/jira/browse/HBASE-6189
>> 
>> -Sean
>> 
>> 
>> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <ar...@gmail.com> wrote:
>> Hi JM
>> 
>> Below are my commands, tried two cases under same source code folder:
>> a) compile with snappy parameters(failed),
>> b) compile without snappy parameters (successful).
>> 
>> Regards
>> Arthur
>> 
>> wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
>> tar -vxf hbase-0.98.4-src.tar.gz
>> mv hbase-0.98.4 hbase-0.98.4-src_snappy
>> cd  hbase-0.98.4-src_snappy
>> nano dev-support/generate-hadoopX-poms.sh
>>   (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
>> 
>> 
>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>> a) with snappy parameters
>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Building HBase - Server 0.98.4-hadoop2
>> [INFO] ------------------------------------------------------------------------
>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] HBase ............................................. SUCCESS [8.192s]
>> [INFO] HBase - Common .................................... SUCCESS [5.638s]
>> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
>> [INFO] HBase - Client .................................... SUCCESS [1.206s]
>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
>> [INFO] HBase - Server .................................... FAILURE [0.234s]
>> [INFO] HBase - Testing Util .............................. SKIPPED
>> [INFO] HBase - Thrift .................................... SKIPPED
>> [INFO] HBase - Shell ..................................... SKIPPED
>> [INFO] HBase - Integration Tests ......................... SKIPPED
>> [INFO] HBase - Examples .................................. SKIPPED
>> [INFO] HBase - Assembly .................................. SKIPPED
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD FAILURE
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 19.474s
>> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
>> [INFO] Final Memory: 51M/1100M
>> [INFO] ------------------------------------------------------------------------
>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions, please read the following articles:
>> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>> [ERROR]
>> [ERROR] After correcting the problems, you can resume the build with the command
>> [ERROR]   mvn <goals> -rf :hbase-server
>> 
>> 
>> 
>> 
>> b) try again, without snappy parameters
>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
>> [INFO] Building tar: /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] HBase ............................................. SUCCESS [3.290s]
>> [INFO] HBase - Common .................................... SUCCESS [3.119s]
>> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
>> [INFO] HBase - Client .................................... SUCCESS [0.920s]
>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
>> [INFO] HBase - Server .................................... SUCCESS [4.790s]
>> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
>> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
>> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
>> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
>> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
>> [INFO] HBase - Assembly .................................. SUCCESS [13.240s]
>> [INFO] ------------------------------------------------------------------------
>> [INFO] BUILD SUCCESS
>> [INFO] ------------------------------------------------------------------------
>> [INFO] Total time: 31.408s
>> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
>> [INFO] Final Memory: 57M/1627M
>> [INFO] ------------------------------------------------------------------------
>> 
>> 
>> 
>> 
>> 
>> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:
>> 
>> > Hi Arthur,
>> >
>> > How have you extracted HBase source and what command do you run to build? I
>> > will do the same here locally so I can provide you the exact step to
>> > complete.
>> >
>> > JM
>> >
>> >
>> > 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>> >> :
>> >
>> >> Hi JM
>> >>
>> >> Not too sure what you mean, do you mean I should create a new folder in my
>> >> HBASE_SRC named lib/native/Linux-x86 and copy these files to this folder
>> >> then try to compile it again?
>> >>
>> >> Regards
>> >> Arthur
>> >>
>> >>
>> >> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
>> >> wrote:
>> >>
>> >>> Hi Arthur,
>> >>>
>> >>> Almost done! You now need to copy them on the HBase folder.
>> >>>
>> >>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep
>> >> -v
>> >>> .rb
>> >>> .
>> >>> ├── native
>> >>> │   └── Linux-x86
>> >>> │       ├── libsnappy.a
>> >>> │       ├── libsnappy.la
>> >>> │       ├── libsnappy.so
>> >>> │       ├── libsnappy.so.1
>> >>> │       └── libsnappy.so.1.2.0
>> >>>
>> >>> I don't have any hadoop-snappy lib in my hbase folder and it works very
>> >>> well with Snappy for me...
>> >>>
>> >>> JM
>> >>>
>> >>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
>> >> arthur.hk.chan@gmail.com
>> >>>> :
>> >>>
>> >>>> Hi JM,
>> >>>>
>> >>>> Below are my steps to install snappy lib, do I miss something?
>> >>>>
>> >>>> Regards
>> >>>> Arthur
>> >>>>
>> >>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>> >>>> tar -vxf snappy-1.1.1.tar.gz
>> >>>> cd snappy-1.1.1
>> >>>> ./configure
>> >>>> make
>> >>>> make install
>> >>>>       make[1]: Entering directory
>> >> `/edh/hadoop_all_sources/snappy-1.1.1'
>> >>>>       test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>> >>>>        /bin/sh ./libtool   --mode=install /usr/bin/install -c
>> >>>> libsnappy.la '/usr/local/lib'
>> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0
>> >>>> /usr/local/lib/libsnappy.so.1.2.0
>> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
>> >>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s
>> >>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
>> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
>> >>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
>> >>>> libsnappy.so.1.2.0 libsnappy.so; }; })
>> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.lai
>> >>>> /usr/local/lib/libsnappy.la
>> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.a
>> >>>> /usr/local/lib/libsnappy.a
>> >>>>       libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>> >>>>       libtool: install: ranlib /usr/local/lib/libsnappy.a
>> >>>>       libtool: finish:
>> >>>>
>> >> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
>> >>>> ldconfig -n /usr/local/lib
>> >>>>
>> >>>> ----------------------------------------------------------------------
>> >>>>       Libraries have been installed in:
>> >>>>       /usr/local/lib
>> >>>>       If you ever happen to want to link against installed libraries
>> >>>>       in a given directory, LIBDIR, you must either use libtool, and
>> >>>>       specify the full pathname of the library, or use the `-LLIBDIR'
>> >>>>       flag during linking and do at least one of the following:
>> >>>>       - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>> >>>>       during execution
>> >>>>       - add LIBDIR to the `LD_RUN_PATH' environment variable
>> >>>>       during linking
>> >>>>       - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>> >>>>       - have your system administrator add LIBDIR to `/etc/ld.so.conf'
>> >>>>       See any operating system documentation about shared libraries for
>> >>>>       more information, such as the ld(1) and ld.so(8) manual pages.
>> >>>>
>> >>>> ----------------------------------------------------------------------
>> >>>>       test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
>> >>>> "/usr/local/share/doc/snappy"
>> >>>>        /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README
>> >>>> format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
>> >>>>       test -z "/usr/local/include" || /bin/mkdir -p
>> >> "/usr/local/include"
>> >>>>        /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
>> >>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
>> >>>>       make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
>> >>>>
>> >>>> ll /usr/local/lib
>> >>>>       -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
>> >>>>       -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
>> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
>> >>>> libsnappy.so.1.2.0
>> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 ->
>> >>>> libsnappy.so.1.2.0
>> >>>>       -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
>> >>>>
>> >>>>
>> >>>>
>> >>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
>> >> jean-marc@spaggiari.org>
>> >>>> wrote:
>> >>>>
>> >>>>> Hi Arthur,
>> >>>>>
>> >>>>> Do you have snappy libs installed and configured? HBase doesn't come
>> >> with
>> >>>>> Snappy. So you need to have it first.
>> >>>>>
>> >>>>> Shameless plug:
>> >>>>>
>> >>>>
>> >> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>> >>>>>
>> >>>>> This is for 0.96 but should be very similar for 0.98. I will try it
>> >> soon
>> >>>>> and post an update, but keep us posted here so we can support you...
>> >>>>>
>> >>>>> JM
>> >>>>>
>> >>>>>
>> >>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
>> >>>> arthur.hk.chan@gmail.com
>> >>>>>> :
>> >>>>>
>> >>>>>> Hi,
>> >>>>>>
>> >>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is
>> >> 2.4.1)
>> >>>>>>
>> >>>>>> Can you please advise what would be wrong?  Should my pom.xml be
>> >>>> incorrect
>> >>>>>> and missing something?
>> >>>>>>
>> >>>>>> Regards
>> >>>>>> Arthur
>> >>>>>>
>> >>>>>>
>> >>>>>> Below are my commands:
>> >>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>> >>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>> >>>>>> -Prelease,hadoop-snappy
>> >>>>>>
>> >>>>>> Log:
>> >>>>>> [INFO]
>> >>>>>>
>> >> ------------------------------------------------------------------------
>> >>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>> >>>>>> [INFO]
>> >>>>>>
>> >> ------------------------------------------------------------------------
>> >>>>>> [WARNING] The POM for
>> >> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
>> >>>>>> is missing, no dependency information available
>> >>>>>> [INFO]
>> >>>>>>
>> >> ------------------------------------------------------------------------
>> >>>>>> [INFO] Reactor Summary:
>> >>>>>> [INFO]
>> >>>>>> [INFO] HBase ............................................. SUCCESS
>> >>>> [3.129s]
>> >>>>>> [INFO] HBase - Common .................................... SUCCESS
>> >>>> [3.105s]
>> >>>>>> [INFO] HBase - Protocol .................................. SUCCESS
>> >>>> [0.976s]
>> >>>>>> [INFO] HBase - Client .................................... SUCCESS
>> >>>> [0.925s]
>> >>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
>> >>>> [0.183s]
>> >>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
>> >>>> [0.497s]
>> >>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
>> >>>> [0.407s]
>> >>>>>> [INFO] HBase - Server .................................... FAILURE
>> >>>> [0.103s]
>> >>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>> >>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>> >>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>> >>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>> >>>>>> [INFO] HBase - Examples .................................. SKIPPED
>> >>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>> >>>>>> [INFO]
>> >>>>>>
>> >> ------------------------------------------------------------------------
>> >>>>>> [INFO] BUILD FAILURE
>> >>>>>> [INFO]
>> >>>>>>
>> >> ------------------------------------------------------------------------
>> >>>>>> [INFO] Total time: 9.939s
>> >>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>> >>>>>> [INFO] Final Memory: 61M/2921M
>> >>>>>> [INFO]
>> >>>>>>
>> >> ------------------------------------------------------------------------
>> >>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
>> >>>> resolve
>> >>>>>> dependencies for project
>> >>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
>> >>>>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>> >>>>>> http://maven.oschina.net/content/groups/public/ was cached in the
>> >> local
>> >>>>>> repository, resolution will not be reattempted until the update
>> >>>> interval of
>> >>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
>> >>>>>> [ERROR]
>> >>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>> >> the
>> >>>>>> -e switch.
>> >>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> >>>>>> [ERROR]
>> >>>>>> [ERROR] For more information about the errors and possible solutions,
>> >>>>>> please read the following articles:
>> >>>>>> [ERROR] [Help 1]
>> >>>>>>
>> >>>>
>> >> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>> >>>>>> [ERROR]
>> >>>>>> [ERROR] After correcting the problems, you can resume the build with
>> >> the
>> >>>>>> command
>> >>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>> >>>>>>
>> >>>>>>
>> >>>>
>> >>>>
>> >>
>> >>
>> 
>> 
>> 
>> 
>> -- 
>> Sean
> 


Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi Sean,

Thanks for your reply.

I tried the following tests

$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test gz
2014-08-26 23:06:17,778 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-26 23:06:18,103 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-26 23:06:18,104 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
2014-08-26 23:06:18,260 INFO  [main] zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
2014-08-26 23:06:18,276 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
2014-08-26 23:06:18,280 INFO  [main] compress.CodecPool: Got brand-new compressor [.gz]
2014-08-26 23:06:18,921 INFO  [main] compress.CodecPool: Got brand-new decompressor [.gz]
SUCCESS


$ bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy
2014-08-26 23:07:08,246 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2014-08-26 23:07:08,578 INFO  [main] util.ChecksumType: Checksum using org.apache.hadoop.util.PureJavaCrc32
2014-08-26 23:07:08,579 INFO  [main] util.ChecksumType: Checksum can use org.apache.hadoop.util.PureJavaCrc32C
Exception in thread "main" java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
	at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:64)
	at org.apache.hadoop.io.compress.SnappyCodec.getCompressorType(SnappyCodec.java:132)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:148)
	at org.apache.hadoop.io.compress.CodecPool.getCompressor(CodecPool.java:163)
	at org.apache.hadoop.hbase.io.compress.Compression$Algorithm.getCompressor(Compression.java:310)
	at org.apache.hadoop.hbase.io.encoding.HFileBlockDefaultEncodingContext.<init>(HFileBlockDefaultEncodingContext.java:92)
	at org.apache.hadoop.hbase.io.hfile.HFileBlock$Writer.<init>(HFileBlock.java:690)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.finishInit(HFileWriterV2.java:117)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2.<init>(HFileWriterV2.java:109)
	at org.apache.hadoop.hbase.io.hfile.HFileWriterV2$WriterFactoryV2.createWriter(HFileWriterV2.java:97)
	at org.apache.hadoop.hbase.io.hfile.HFile$WriterFactory.create(HFile.java:393)
	at org.apache.hadoop.hbase.util.CompressionTest.doSmokeTest(CompressionTest.java:118)
	at org.apache.hadoop.hbase.util.CompressionTest.main(CompressionTest.java:148)
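
The exception above says libhadoop itself was built without snappy support. A hedged way to confirm which native codecs the installed Hadoop build carries (`hadoop checknative` ships with Hadoop 2.x; it is assumed here to be on PATH):

```shell
# Report whether the loaded libhadoop was compiled against libsnappy.
# Guarded so the snippet degrades gracefully on a machine without Hadoop.
check_snappy() {
  if command -v hadoop >/dev/null 2>&1; then
    # "hadoop checknative -a" prints one line per codec, e.g. "snappy: true ..."
    hadoop checknative -a 2>&1 | grep -i snappy \
      || echo "no snappy line in checknative output"
  else
    echo "hadoop not on PATH; run this on a cluster node"
  fi
}
check_snappy
```

If the report says `snappy: false`, no amount of HBase-side configuration will help until libhadoop is rebuilt (or replaced) with snappy enabled.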


$ hbase shell
2014-08-27 06:23:38,707 INFO  [main] Configuration.deprecation: hadoop.native.lib is deprecated. Instead, use io.native.lib.available
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 0.98.4-hadoop2, rUnknown, Sun Aug  3 23:45:36 HKT 2014

hbase(main):001:0> 
hbase(main):001:0> create 'tsnappy', { NAME => 'f', COMPRESSION => 'snappy'}
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/edh/hadoop/hbase-0.98.4-hadoop2/lib/slf4j-log4j12-1.6.4.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/edh/hadoop/hadoop-2.4.1/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.

ERROR: java.io.IOException: Compression algorithm 'snappy' previously failed test.
	at org.apache.hadoop.hbase.util.CompressionTest.testCompression(CompressionTest.java:85)
	at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1764)
	at org.apache.hadoop.hbase.master.HMaster.checkCompression(HMaster.java:1757)
	at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1739)
	at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1774)
	at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:40470)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
	at org.apache.hadoop.hbase.ipc.FifoRpcScheduler$1.run(FifoRpcScheduler.java:74)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
	at java.util.concurrent.FutureTask.run(FutureTask.java:138)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
	at java.lang.Thread.run(Thread.java:662)




Regards
Arthur


On 26 Aug, 2014, at 11:02 pm, Sean Busbey <bu...@cloudera.com> wrote:

> Hi Arthur!
> 
> Our Snappy build instructions are currently out of date and I'm working on updating them[1]. In short, I don't think there are any special build steps for using snappy.
> 
> I'm still working out what needs to be included in our instructions for local and cluster testing.
> 
> If you use the test for compression options, locally things will fail because the native hadoop libs won't be present:
> 
> bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy 
> (for comparison, replace "snappy" with "gz" and you will get a warning about not having native libraries, but the test will succeed.)
> 
> I believe JM's suggestion is for you to copy the Hadoop native libraries into the local HBase lib/native directory, which would allow the local test to pass. If you are running in a deployed Hadoop cluster, I would expect the necessary libraries to already be available to HBase.
> 
> [1]: https://issues.apache.org/jira/browse/HBASE-6189
> 
> -Sean
> 
> 
> On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <ar...@gmail.com> wrote:
> Hi JM
> 
> Below are my commands; I tried two cases under the same source code folder:
> a) compile with snappy parameters (failed),
> b) compile without snappy parameters (successful).
> 
> Regards
> Arthur
> 
> wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
> tar -vxf hbase-0.98.4-src.tar.gz
> mv hbase-0.98.4 hbase-0.98.4-src_snappy
> cd  hbase-0.98.4-src_snappy
> nano dev-support/generate-hadoopX-poms.sh
>   (change  hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
> 
> 
> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> a) with snappy parameters
> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
> [INFO] ------------------------------------------------------------------------
> [INFO] Building HBase - Server 0.98.4-hadoop2
> [INFO] ------------------------------------------------------------------------
> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] HBase ............................................. SUCCESS [8.192s]
> [INFO] HBase - Common .................................... SUCCESS [5.638s]
> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
> [INFO] HBase - Client .................................... SUCCESS [1.206s]
> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
> [INFO] HBase - Server .................................... FAILURE [0.234s]
> [INFO] HBase - Testing Util .............................. SKIPPED
> [INFO] HBase - Thrift .................................... SKIPPED
> [INFO] HBase - Shell ..................................... SKIPPED
> [INFO] HBase - Integration Tests ......................... SKIPPED
> [INFO] HBase - Examples .................................. SKIPPED
> [INFO] HBase - Assembly .................................. SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 19.474s
> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
> [INFO] Final Memory: 51M/1100M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the command
> [ERROR]   mvn <goals> -rf :hbase-server
> 
> 
> 
> 
> b) try again, without snappy parameters
> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
> [INFO] Building tar: /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] HBase ............................................. SUCCESS [3.290s]
> [INFO] HBase - Common .................................... SUCCESS [3.119s]
> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
> [INFO] HBase - Client .................................... SUCCESS [0.920s]
> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
> [INFO] HBase - Server .................................... SUCCESS [4.790s]
> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
> [INFO] HBase - Assembly .................................. SUCCESS [13.240s]
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 31.408s
> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
> [INFO] Final Memory: 57M/1627M
> [INFO] ------------------------------------------------------------------------
> 
> 
> 
> 
> 
> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:
> 
> > Hi Arthur,
> >
> > How have you extracted HBase source and what command do you run to build? I
> > will do the same here locally so I can provide you the exact steps to
> > complete.
> >
> > JM
> >
> >
> > 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
> >> :
> >
> >> Hi JM
> >>
> >> Not too sure what you mean, do you mean I should create a new folder in my
> >> HBASE_SRC named lib/native/Linux-x86 and copy these files to this folder
> >> then try to compile it again?
> >>
> >> Regards
> >> Arthur
> >>
> >>
> >> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
> >> wrote:
> >>
> >>> Hi Arthur,
> >>>
> >>> Almost done! You now need to copy them into the HBase folder.
> >>>
> >>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep
> >> -v
> >>> .rb
> >>> .
> >>> ├── native
> >>> │   └── Linux-x86
> >>> │       ├── libsnappy.a
> >>> │       ├── libsnappy.la
> >>> │       ├── libsnappy.so
> >>> │       ├── libsnappy.so.1
> >>> │       └── libsnappy.so.1.2.0
> >>>
> >>> I don't have any hadoop-snappy lib in my hbase folder and it works very
> >>> well with Snappy for me...
> >>>
> >>> JM
> >>>
> >>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
> >> arthur.hk.chan@gmail.com
> >>>> :
> >>>
> >>>> Hi JM,
> >>>>
> >>>> Below are my steps to install the snappy lib; am I missing something?
> >>>>
> >>>> Regards
> >>>> Arthur
> >>>>
> >>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> >>>> tar -vxf snappy-1.1.1.tar.gz
> >>>> cd snappy-1.1.1
> >>>> ./configure
> >>>> make
> >>>> make install
> >>>>       make[1]: Entering directory
> >> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>       test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> >>>>        /bin/sh ./libtool   --mode=install /usr/bin/install -c
> >>>> libsnappy.la '/usr/local/lib'
> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0
> >>>> /usr/local/lib/libsnappy.so.1.2.0
> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s
> >>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
> >>>> libsnappy.so.1.2.0 libsnappy.so; }; })
> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> >>>> /usr/local/lib/libsnappy.la
> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.a
> >>>> /usr/local/lib/libsnappy.a
> >>>>       libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> >>>>       libtool: install: ranlib /usr/local/lib/libsnappy.a
> >>>>       libtool: finish:
> >>>>
> >> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> >>>> ldconfig -n /usr/local/lib
> >>>>
> >>>> ----------------------------------------------------------------------
> >>>>       Libraries have been installed in:
> >>>>       /usr/local/lib
> >>>>       If you ever happen to want to link against installed libraries
> >>>>       in a given directory, LIBDIR, you must either use libtool, and
> >>>>       specify the full pathname of the library, or use the `-LLIBDIR'
> >>>>       flag during linking and do at least one of the following:
> >>>>       - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> >>>>       during execution
> >>>>       - add LIBDIR to the `LD_RUN_PATH' environment variable
> >>>>       during linking
> >>>>       - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> >>>>       - have your system administrator add LIBDIR to `/etc/ld.so.conf'
> >>>>       See any operating system documentation about shared libraries for
> >>>>       more information, such as the ld(1) and ld.so(8) manual pages.
> >>>>
> >>>> ----------------------------------------------------------------------
> >>>>       test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> >>>> "/usr/local/share/doc/snappy"
> >>>>        /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README
> >>>> format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
> >>>>       test -z "/usr/local/include" || /bin/mkdir -p
> >> "/usr/local/include"
> >>>>        /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> >>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> >>>>       make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>
> >>>> ll /usr/local/lib
> >>>>       -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> >>>>       -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
> >>>> libsnappy.so.1.2.0
> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 ->
> >>>> libsnappy.so.1.2.0
> >>>>       -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
> >>>>
> >>>>
> >>>>
> >>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
> >> jean-marc@spaggiari.org>
> >>>> wrote:
> >>>>
> >>>>> Hi Arthur,
> >>>>>
> >>>>> Do you have snappy libs installed and configured? HBase doesn't come
> >> with
> >>>>> Snappy. So you need to have it first.
> >>>>>
> >>>>> Shameless plug:
> >>>>>
> >>>>
> >> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >>>>>
> >>>>> This is for 0.96 but should be very similar for 0.98. I will try it
> >> soon
> >>>>> and post an update, but keep us posted here so we can support you...
> >>>>>
> >>>>> JM
> >>>>>
> >>>>>
> >>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>>> arthur.hk.chan@gmail.com
> >>>>>> :
> >>>>>
> >>>>>> Hi,
> >>>>>>
> >>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is
> >> 2.4.1)
> >>>>>>
> >>>>>> Can you please advise what would be wrong?  Should my pom.xml be
> >>>> incorrect
> >>>>>> and missing something?
> >>>>>>
> >>>>>> Regards
> >>>>>> Arthur
> >>>>>>
> >>>>>>
> >>>>>> Below are my commands:
> >>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >>>>>> -Prelease,hadoop-snappy
> >>>>>>
> >>>>>> Log:
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [WARNING] The POM for
> >> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> >>>>>> is missing, no dependency information available
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] Reactor Summary:
> >>>>>> [INFO]
> >>>>>> [INFO] HBase ............................................. SUCCESS
> >>>> [3.129s]
> >>>>>> [INFO] HBase - Common .................................... SUCCESS
> >>>> [3.105s]
> >>>>>> [INFO] HBase - Protocol .................................. SUCCESS
> >>>> [0.976s]
> >>>>>> [INFO] HBase - Client .................................... SUCCESS
> >>>> [0.925s]
> >>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> >>>> [0.183s]
> >>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> >>>> [0.497s]
> >>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
> >>>> [0.407s]
> >>>>>> [INFO] HBase - Server .................................... FAILURE
> >>>> [0.103s]
> >>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] BUILD FAILURE
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] Total time: 9.939s
> >>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >>>>>> [INFO] Final Memory: 61M/2921M
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
> >>>> resolve
> >>>>>> dependencies for project
> >>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> >>>>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >>>>>> http://maven.oschina.net/content/groups/public/ was cached in the
> >> local
> >>>>>> repository, resolution will not be reattempted until the update
> >>>> interval of
> >>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>>> [ERROR]
> >>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
> >> the
> >>>>>> -e switch.
> >>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >>>>>> [ERROR]
> >>>>>> [ERROR] For more information about the errors and possible solutions,
> >>>>>> please read the following articles:
> >>>>>> [ERROR] [Help 1]
> >>>>>>
> >>>>
> >> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>>> [ERROR]
> >>>>>> [ERROR] After correcting the problems, you can resume the build with
> >> the
> >>>>>> command
> >>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>>
> >>>>>>
> >>>>
> >>>>
> >>
> >>
> 
> 
> 
> 
> -- 
> Sean


Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Sean Busbey <bu...@cloudera.com>.
Hi Arthur!

Our Snappy build instructions are currently out of date and I'm working on
updating them[1]. In short, I don't think there are any special build steps
for using snappy.

I'm still working out what needs to be included in our instructions for
local and cluster testing.

If you use the test for compression options, locally things will fail
because the native hadoop libs won't be present:

bin/hbase org.apache.hadoop.hbase.util.CompressionTest
file:///tmp/snappy-test snappy

(for comparison, replace "snappy" with "gz" and you will get a warning
about not having native libraries, but the test will succeed.)


I believe JM's suggestion is for you to copy the Hadoop native libraries
into the local HBase lib/native directory, which would allow the local test
to pass. If you are running in a deployed Hadoop cluster, I would expect
the necessary libraries to already be available to HBase.

[1]: https://issues.apache.org/jira/browse/HBASE-6189
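
JM's copy suggestion can be sketched roughly as follows (editor's hedged sketch: the HADOOP_HOME/HBASE_HOME defaults are the paths seen in this thread, and the Linux-amd64-64 subdirectory name is an assumption for 64-bit installs; 32-bit installs use Linux-x86 as in JM's tree listing):

```shell
# Sketch: copy Hadoop's native libs (libhadoop, libsnappy) into HBase's
# lib/native tree so a local CompressionTest run can load them.
HADOOP_HOME="${HADOOP_HOME:-/edh/hadoop/hadoop-2.4.1}"       # assumed path
HBASE_HOME="${HBASE_HOME:-/edh/hadoop/hbase-0.98.4-hadoop2}" # assumed path
SRC="$HADOOP_HOME/lib/native"
DST="$HBASE_HOME/lib/native/Linux-amd64-64"   # assumption; Linux-x86 on 32-bit
if [ -d "$SRC" ]; then
  mkdir -p "$DST"
  cp -a "$SRC"/libhadoop.so* "$SRC"/libsnappy.so* "$DST"/
  ls -l "$DST"
else
  echo "no native libs under $SRC; adjust HADOOP_HOME"
fi
```

After copying, rerun the local test: bin/hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-test snappy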

-Sean


On Tue, Aug 26, 2014 at 8:30 AM, Arthur.hk.chan@gmail.com <
arthur.hk.chan@gmail.com> wrote:

> Hi JM
>
> Below are my commands; I tried two cases under the same source code folder:
> a) compile with snappy parameters (failed),
> b) compile without snappy parameters (successful).
>
> Regards
> Arthur
>
> wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
> tar -vxf hbase-0.98.4-src.tar.gz
> mv hbase-0.98.4 hbase-0.98.4-src_snappy
> cd  hbase-0.98.4-src_snappy
> nano dev-support/generate-hadoopX-poms.sh
>   (change  hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")
>
>
> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> a) with snappy parameters
> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Building HBase - Server 0.98.4-hadoop2
> [INFO]
> ------------------------------------------------------------------------
> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> is missing, no dependency information available
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] HBase ............................................. SUCCESS [8.192s]
> [INFO] HBase - Common .................................... SUCCESS [5.638s]
> [INFO] HBase - Protocol .................................. SUCCESS [1.535s]
> [INFO] HBase - Client .................................... SUCCESS [1.206s]
> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
> [INFO] HBase - Server .................................... FAILURE [0.234s]
> [INFO] HBase - Testing Util .............................. SKIPPED
> [INFO] HBase - Thrift .................................... SKIPPED
> [INFO] HBase - Shell ..................................... SKIPPED
> [INFO] HBase - Integration Tests ......................... SKIPPED
> [INFO] HBase - Examples .................................. SKIPPED
> [INFO] HBase - Assembly .................................. SKIPPED
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time: 19.474s
> [INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
> [INFO] Final Memory: 51M/1100M
> [INFO]
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal on project hbase-server: Could not resolve
> dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> http://maven.oschina.net/content/groups/public/ was cached in the local
> repository, resolution will not be reattempted until the update interval of
> nexus-osc has elapsed or updates are forced -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions,
> please read the following articles:
> [ERROR] [Help 1]
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the
> command
> [ERROR]   mvn <goals> -rf :hbase-server
>
>
>
>
> b) try again, without snappy parameters
> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
> [INFO] Building tar:
> /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] HBase ............................................. SUCCESS [3.290s]
> [INFO] HBase - Common .................................... SUCCESS [3.119s]
> [INFO] HBase - Protocol .................................. SUCCESS [0.972s]
> [INFO] HBase - Client .................................... SUCCESS [0.920s]
> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
> [INFO] HBase - Server .................................... SUCCESS [4.790s]
> [INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
> [INFO] HBase - Thrift .................................... SUCCESS [1.536s]
> [INFO] HBase - Shell ..................................... SUCCESS [0.369s]
> [INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
> [INFO] HBase - Examples .................................. SUCCESS [0.459s]
> [INFO] HBase - Assembly .................................. SUCCESS
> [13.240s]
> [INFO]
> ------------------------------------------------------------------------
> [INFO] BUILD SUCCESS
> [INFO]
> ------------------------------------------------------------------------
> [INFO] Total time: 31.408s
> [INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
> [INFO] Final Memory: 57M/1627M
> [INFO]
> ------------------------------------------------------------------------
>
>
>
>
>
> On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
> wrote:
>
> > Hi Arthur,
> >
> > How have you extracted HBase source and what command do you run to
> build? I
> > will do the same here locally so I can provide you the exact steps to
> > complete.
> >
> > JM
> >
> >
> > 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com
> >> :
> >
> >> Hi JM
> >>
> >> Not too sure what you mean, do you mean I should create a new folder in
> my
> >> HBASE_SRC named lib/native/Linux-x86 and copy these files to this folder
> >> then try to compile it again?
> >>
> >> Regards
> >> Arthur
> >>
> >>
> >> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org>
> >> wrote:
> >>
> >>> Hi Arthur,
> >>>
> >>> Almost done! You now need to copy them into the HBase folder.
> >>>
> >>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar |
> grep
> >> -v
> >>> .rb
> >>> .
> >>> ├── native
> >>> │   └── Linux-x86
> >>> │       ├── libsnappy.a
> >>> │       ├── libsnappy.la
> >>> │       ├── libsnappy.so
> >>> │       ├── libsnappy.so.1
> >>> │       └── libsnappy.so.1.2.0
> >>>
> >>> I don't have any hadoop-snappy lib in my hbase folder and it works very
> >>> well with Snappy for me...
> >>>
> >>> JM
> >>>
> >>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
> >> arthur.hk.chan@gmail.com
> >>>> :
> >>>
> >>>> Hi JM,
> >>>>
> >>>> Below are my steps to install the snappy lib; am I missing something?
> >>>>
> >>>> Regards
> >>>> Arthur
> >>>>
> >>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> >>>> tar -vxf snappy-1.1.1.tar.gz
> >>>> cd snappy-1.1.1
> >>>> ./configure
> >>>> make
> >>>> make install
> >>>>       make[1]: Entering directory
> >> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>       test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> >>>>        /bin/sh ./libtool   --mode=install /usr/bin/install -c
> >>>> libsnappy.la '/usr/local/lib'
> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0
> >>>> /usr/local/lib/libsnappy.so.1.2.0
> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s
> >>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> >>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
> >>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
> >>>> libsnappy.so.1.2.0 libsnappy.so; }; })
> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> >>>> /usr/local/lib/libsnappy.la
> >>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.a
> >>>> /usr/local/lib/libsnappy.a
> >>>>       libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> >>>>       libtool: install: ranlib /usr/local/lib/libsnappy.a
> >>>>       libtool: finish:
> >>>>
> >>
> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> >>>> ldconfig -n /usr/local/lib
> >>>>
> >>>> ----------------------------------------------------------------------
> >>>>       Libraries have been installed in:
> >>>>       /usr/local/lib
> >>>>       If you ever happen to want to link against installed libraries
> >>>>       in a given directory, LIBDIR, you must either use libtool, and
> >>>>       specify the full pathname of the library, or use the `-LLIBDIR'
> >>>>       flag during linking and do at least one of the following:
> >>>>       - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> >>>>       during execution
> >>>>       - add LIBDIR to the `LD_RUN_PATH' environment variable
> >>>>       during linking
> >>>>       - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> >>>>       - have your system administrator add LIBDIR to `/etc/ld.so.conf'
> >>>>       See any operating system documentation about shared libraries
> for
> >>>>       more information, such as the ld(1) and ld.so(8) manual pages.
> >>>>
> >>>> ----------------------------------------------------------------------
> >>>>       test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> >>>> "/usr/local/share/doc/snappy"
> >>>>        /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS
> README
> >>>> format_description.txt framing_format.txt
> '/usr/local/share/doc/snappy'
> >>>>       test -z "/usr/local/include" || /bin/mkdir -p
> >> "/usr/local/include"
> >>>>        /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> >>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> >>>>       make[1]: Leaving directory
> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>>>
> >>>> ll /usr/local/lib
> >>>>       -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> >>>>       -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
> >>>> libsnappy.so.1.2.0
> >>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 ->
> >>>> libsnappy.so.1.2.0
> >>>>       -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
> >>>>
> >>>>
> >>>>
> >>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
> >> jean-marc@spaggiari.org>
> >>>> wrote:
> >>>>
> >>>>> Hi Arthur,
> >>>>>
> >>>>> Do you have snappy libs installed and configured? HBase doesn't come
> >> with
> >>>>> Snappy. So yo need to have it first.
> >>>>>
> >>>>> Shameless plug:
> >>>>>
> >>>>
> >>
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >>>>>
> >>>>> This is for 0.96 but should be very similar for 0.98. I will try it
> >> soon
> >>>>> and post and update, but keep us posted here so we can support you...
> >>>>>
> >>>>> JM
> >>>>>
> >>>>>
> >>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> >>>> arthur.hk.chan@gmail.com
> >>>>>> :
> >>>>>
> >>>>>> Hi,
> >>>>>>
> >>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is
> >> 2.4.1)
> >>>>>>
> >>>>>> Can you please advise what would be wrong?  Should my pom.xml be
> >>>> incorrect
> >>>>>> and missing something?
> >>>>>>
> >>>>>> Regards
> >>>>>> Arthur
> >>>>>>
> >>>>>>
> >>>>>> Below are my commands:
> >>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >>>>>> -Prelease,hadoop-snappy
> >>>>>>
> >>>>>> Iog:
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [WARNING] The POM for
> >> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> >>>>>> is missing, no dependency information available
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] Reactor Summary:
> >>>>>> [INFO]
> >>>>>> [INFO] HBase ............................................. SUCCESS
> >>>> [3.129s]
> >>>>>> [INFO] HBase - Common .................................... SUCCESS
> >>>> [3.105s]
> >>>>>> [INFO] HBase - Protocol .................................. SUCCESS
> >>>> [0.976s]
> >>>>>> [INFO] HBase - Client .................................... SUCCESS
> >>>> [0.925s]
> >>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> >>>> [0.183s]
> >>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> >>>> [0.497s]
> >>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
> >>>> [0.407s]
> >>>>>> [INFO] HBase - Server .................................... FAILURE
> >>>> [0.103s]
> >>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] BUILD FAILURE
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [INFO] Total time: 9.939s
> >>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >>>>>> [INFO] Final Memory: 61M/2921M
> >>>>>> [INFO]
> >>>>>>
> >> ------------------------------------------------------------------------
> >>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
> >>>> resolve
> >>>>>> dependencies for project
> >>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> >>>>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> in
> >>>>>> http://maven.oschina.net/content/groups/public/ was cached in the
> >> local
> >>>>>> repository, resolution will not be reattempted until the update
> >>>> interval of
> >>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>>>> [ERROR]
> >>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
> >> the
> >>>>>> -e switch.
> >>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug
> logging.
> >>>>>> [ERROR]
> >>>>>> [ERROR] For more information about the errors and possible
> solutions,
> >>>>>> please read the following articles:
> >>>>>> [ERROR] [Help 1]
> >>>>>>
> >>>>
> >>
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>>>> [ERROR]
> >>>>>> [ERROR] After correcting the problems, you can resume the build with
> >> the
> >>>>>> command
> >>>>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>>>
> >>>>>>
> >>>>
> >>>>
> >>
> >>
>
>


-- 
Sean

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi JM

Below are my commands; I tried two cases in the same source code folder:
a) compile with the snappy parameters (failed)
b) compile without the snappy parameters (successful)

Regards
Arthur

wget http://mirrors.devlib.org/apache/hbase/stable/hbase-0.98.4-src.tar.gz
tar -vxf hbase-0.98.4-src.tar.gz
mv hbase-0.98.4 hbase-0.98.4-src_snappy
cd  hbase-0.98.4-src_snappy
nano dev-support/generate-hadoopX-poms.sh
  (change hbase_home="/usr/local/hadoop/hbase-0.98.4-src_snappy")


bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
a) with snappy parameters
mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy -Dhadoop-snappy.version=0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] Building HBase - Server 0.98.4-hadoop2
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT is missing, no dependency information available
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] HBase ............................................. SUCCESS [8.192s]
[INFO] HBase - Common .................................... SUCCESS [5.638s]
[INFO] HBase - Protocol .................................. SUCCESS [1.535s]
[INFO] HBase - Client .................................... SUCCESS [1.206s]
[INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.193s]
[INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.798s]
[INFO] HBase - Prefix Tree ............................... SUCCESS [0.438s]
[INFO] HBase - Server .................................... FAILURE [0.234s]
[INFO] HBase - Testing Util .............................. SKIPPED
[INFO] HBase - Thrift .................................... SKIPPED
[INFO] HBase - Shell ..................................... SKIPPED
[INFO] HBase - Integration Tests ......................... SKIPPED
[INFO] HBase - Examples .................................. SKIPPED
[INFO] HBase - Assembly .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 19.474s
[INFO] Finished at: Tue Aug 26 21:21:13 HKT 2014
[INFO] Final Memory: 51M/1100M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project hbase-server: Could not resolve dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2: Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in http://maven.oschina.net/content/groups/public/ was cached in the local repository, resolution will not be reattempted until the update interval of nexus-osc has elapsed or updates are forced -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
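
(Side note: since the error says the artifact "was cached in the local repository" and "resolution will not be reattempted until the update interval ... has elapsed or updates are forced", one thing I could try is purging the cached entry and forcing an update with -U. This is just a sketch, and I suppose it won't help if hadoop-snappy:0.0.1-SNAPSHOT simply isn't published in that repository at all:)

```shell
# Drop Maven's cached negative resolution for hadoop-snappy, then force
# a remote re-check with -U (assumes the default ~/.m2 repository layout)
rm -rf ~/.m2/repository/org/apache/hadoop/hadoop-snappy
mvn -f pom.xml.hadoop2 -U install -DskipTests assembly:single -Prelease,hadoop-snappy
```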




b) try again, without snappy parameters
mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
[INFO] Building tar: /edh/hadoop_all_sources/hbase-0.98.4-src_snappy/hbase-assembly/target/hbase-0.98.4-hadoop2-bin.tar.gz
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] HBase ............................................. SUCCESS [3.290s]
[INFO] HBase - Common .................................... SUCCESS [3.119s]
[INFO] HBase - Protocol .................................. SUCCESS [0.972s]
[INFO] HBase - Client .................................... SUCCESS [0.920s]
[INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.167s]
[INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.504s]
[INFO] HBase - Prefix Tree ............................... SUCCESS [0.382s]
[INFO] HBase - Server .................................... SUCCESS [4.790s]
[INFO] HBase - Testing Util .............................. SUCCESS [0.598s]
[INFO] HBase - Thrift .................................... SUCCESS [1.536s]
[INFO] HBase - Shell ..................................... SUCCESS [0.369s]
[INFO] HBase - Integration Tests ......................... SUCCESS [0.443s]
[INFO] HBase - Examples .................................. SUCCESS [0.459s]
[INFO] HBase - Assembly .................................. SUCCESS [13.240s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 31.408s
[INFO] Finished at: Tue Aug 26 21:22:50 HKT 2014
[INFO] Final Memory: 57M/1627M
[INFO] ------------------------------------------------------------------------
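
For what it's worth, once I have a snappy-enabled build deployed, my understanding from the HBase docs is that the usual sanity check is the CompressionTest utility (sketch; assumes the hbase script is on the PATH and /tmp is writable):

```shell
# Ask HBase to load the snappy codec and compress a probe file;
# it should report SUCCESS if the native library is found.
hbase org.apache.hadoop.hbase.util.CompressionTest file:///tmp/snappy-probe snappy
```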





On 26 Aug, 2014, at 8:52 pm, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> Hi Arthur,
> 
> How have you extracted HBase source and what command do you run to build? I
> will do the same here locally so I can provide you the exact step to
> complete.
> 
> JM
> 
> 
> 2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>> :
> 
>> Hi JM
>> 
>> Not too sure what you mean, do you mean I should create a new folder in my
>> HBASE_SRC named lib/native/Linux-x86 and copy these files to this folder
>> then try to compile it again?
>> 
>> Regards
>> ARthur
>> 
>> 
>> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
>> wrote:
>> 
>>> Hi Arthur,
>>> 
>>> Almost done! You now need to copy them on the HBase folder.
>>> 
>>> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep
>> -v
>>> .rb
>>> .
>>> ├── native
>>> │   └── Linux-x86
>>> │       ├── libsnappy.a
>>> │       ├── libsnappy.la
>>> │       ├── libsnappy.so
>>> │       ├── libsnappy.so.1
>>> │       └── libsnappy.so.1.2.0
>>> 
>>> I don't have any hadoop-snappy lib in my hbase folder and it works very
>>> well with Snappy for me...
>>> 
>>> JM
>>> 
>>> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
>> arthur.hk.chan@gmail.com
>>>> :
>>> 
>>>> Hi JM,
>>>> 
>>>> Below are my steps to install snappy lib, do I miss something?
>>>> 
>>>> Regards
>>>> Arthur
>>>> 
>>>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>>>> tar -vxf snappy-1.1.1.tar.gz
>>>> cd snappy-1.1.1
>>>> ./configure
>>>> make
>>>> make install
>>>>       make[1]: Entering directory
>> `/edh/hadoop_all_sources/snappy-1.1.1'
>>>>       test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>>>>        /bin/sh ./libtool   --mode=install /usr/bin/install -c
>>>> libsnappy.la '/usr/local/lib'
>>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0
>>>> /usr/local/lib/libsnappy.so.1.2.0
>>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
>>>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s
>>>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
>>>>       libtool: install: (cd /usr/local/lib && { ln -s -f
>>>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
>>>> libsnappy.so.1.2.0 libsnappy.so; }; })
>>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.lai
>>>> /usr/local/lib/libsnappy.la
>>>>       libtool: install: /usr/bin/install -c .libs/libsnappy.a
>>>> /usr/local/lib/libsnappy.a
>>>>       libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>>>>       libtool: install: ranlib /usr/local/lib/libsnappy.a
>>>>       libtool: finish:
>>>> 
>> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
>>>> ldconfig -n /usr/local/lib
>>>> 
>>>> ----------------------------------------------------------------------
>>>>       Libraries have been installed in:
>>>>       /usr/local/lib
>>>>       If you ever happen to want to link against installed libraries
>>>>       in a given directory, LIBDIR, you must either use libtool, and
>>>>       specify the full pathname of the library, or use the `-LLIBDIR'
>>>>       flag during linking and do at least one of the following:
>>>>       - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>>>>       during execution
>>>>       - add LIBDIR to the `LD_RUN_PATH' environment variable
>>>>       during linking
>>>>       - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>>>>       - have your system administrator add LIBDIR to `/etc/ld.so.conf'
>>>>       See any operating system documentation about shared libraries for
>>>>       more information, such as the ld(1) and ld.so(8) manual pages.
>>>> 
>>>> ----------------------------------------------------------------------
>>>>       test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
>>>> "/usr/local/share/doc/snappy"
>>>>        /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README
>>>> format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
>>>>       test -z "/usr/local/include" || /bin/mkdir -p
>> "/usr/local/include"
>>>>        /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
>>>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
>>>>       make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
>>>> 
>>>> ll /usr/local/lib
>>>>       -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
>>>>       -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
>>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
>>>> libsnappy.so.1.2.0
>>>>       lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 ->
>>>> libsnappy.so.1.2.0
>>>>       -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
>>>> 
>>>> 
>>>> 
>>>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
>> jean-marc@spaggiari.org>
>>>> wrote:
>>>> 
>>>>> Hi Arthur,
>>>>> 
>>>>> Do you have snappy libs installed and configured? HBase doesn't come
>> with
>>>>> Snappy. So yo need to have it first.
>>>>> 
>>>>> Shameless plug:
>>>>> 
>>>> 
>> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>>>>> 
>>>>> This is for 0.96 but should be very similar for 0.98. I will try it
>> soon
>>>>> and post and update, but keep us posted here so we can support you...
>>>>> 
>>>>> JM
>>>>> 
>>>>> 
>>>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
>>>> arthur.hk.chan@gmail.com
>>>>>> :
>>>>> 
>>>>>> Hi,
>>>>>> 
>>>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is
>> 2.4.1)
>>>>>> 
>>>>>> Can you please advise what would be wrong?  Should my pom.xml be
>>>> incorrect
>>>>>> and missing something?
>>>>>> 
>>>>>> Regards
>>>>>> Arthur
>>>>>> 
>>>>>> 
>>>>>> Below are my commands:
>>>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>>>>>> -Prelease,hadoop-snappy
>>>>>> 
>>>>>> Iog:
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [WARNING] The POM for
>> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
>>>>>> is missing, no dependency information available
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [INFO] Reactor Summary:
>>>>>> [INFO]
>>>>>> [INFO] HBase ............................................. SUCCESS
>>>> [3.129s]
>>>>>> [INFO] HBase - Common .................................... SUCCESS
>>>> [3.105s]
>>>>>> [INFO] HBase - Protocol .................................. SUCCESS
>>>> [0.976s]
>>>>>> [INFO] HBase - Client .................................... SUCCESS
>>>> [0.925s]
>>>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
>>>> [0.183s]
>>>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
>>>> [0.497s]
>>>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
>>>> [0.407s]
>>>>>> [INFO] HBase - Server .................................... FAILURE
>>>> [0.103s]
>>>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [INFO] BUILD FAILURE
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [INFO] Total time: 9.939s
>>>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>>>>>> [INFO] Final Memory: 61M/2921M
>>>>>> [INFO]
>>>>>> 
>> ------------------------------------------------------------------------
>>>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
>>>> resolve
>>>>>> dependencies for project
>>>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
>>>>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>>>>>> http://maven.oschina.net/content/groups/public/ was cached in the
>> local
>>>>>> repository, resolution will not be reattempted until the update
>>>> interval of
>>>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
>>>>>> [ERROR]
>>>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
>> the
>>>>>> -e switch.
>>>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>>>> [ERROR]
>>>>>> [ERROR] For more information about the errors and possible solutions,
>>>>>> please read the following articles:
>>>>>> [ERROR] [Help 1]
>>>>>> 
>>>> 
>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>>>> [ERROR]
>>>>>> [ERROR] After correcting the problems, you can resume the build with
>> the
>>>>>> command
>>>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>>>> 
>>>>>> 
>>>> 
>>>> 
>> 
>> 


Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Hi Arthur,

How have you extracted the HBase source, and what command do you run to build? I
will do the same here locally so I can provide you the exact steps to
complete.

JM


2014-08-26 8:42 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>:

> Hi JM
>
> Not too sure what you mean, do you mean I should create a new folder in my
> HBASE_SRC named lib/native/Linux-x86 and copy these files to this folder
> then try to compile it again?
>
> Regards
> ARthur
>
>
> On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
> wrote:
>
> > Hi Arthur,
> >
> > Almost done! You now need to copy them on the HBase folder.
> >
> > hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep
> -v
> > .rb
> > .
> > ├── native
> > │   └── Linux-x86
> > │       ├── libsnappy.a
> > │       ├── libsnappy.la
> > │       ├── libsnappy.so
> > │       ├── libsnappy.so.1
> > │       └── libsnappy.so.1.2.0
> >
> > I don't have any hadoop-snappy lib in my hbase folder and it works very
> > well with Snappy for me...
> >
> > JM
> >
> > 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com
> >> :
> >
> >> Hi JM,
> >>
> >> Below are my steps to install snappy lib, do I miss something?
> >>
> >> Regards
> >> Arthur
> >>
> >> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> >> tar -vxf snappy-1.1.1.tar.gz
> >> cd snappy-1.1.1
> >> ./configure
> >> make
> >> make install
> >>        make[1]: Entering directory
> `/edh/hadoop_all_sources/snappy-1.1.1'
> >>        test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
> >>         /bin/sh ./libtool   --mode=install /usr/bin/install -c
> >> libsnappy.la '/usr/local/lib'
> >>        libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0
> >> /usr/local/lib/libsnappy.so.1.2.0
> >>        libtool: install: (cd /usr/local/lib && { ln -s -f
> >> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s
> >> libsnappy.so.1.2.0 libsnappy.so.1; }; })
> >>        libtool: install: (cd /usr/local/lib && { ln -s -f
> >> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
> >> libsnappy.so.1.2.0 libsnappy.so; }; })
> >>        libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> >> /usr/local/lib/libsnappy.la
> >>        libtool: install: /usr/bin/install -c .libs/libsnappy.a
> >> /usr/local/lib/libsnappy.a
> >>        libtool: install: chmod 644 /usr/local/lib/libsnappy.a
> >>        libtool: install: ranlib /usr/local/lib/libsnappy.a
> >>        libtool: finish:
> >>
> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> >> ldconfig -n /usr/local/lib
> >>
> >> ----------------------------------------------------------------------
> >>        Libraries have been installed in:
> >>        /usr/local/lib
> >>        If you ever happen to want to link against installed libraries
> >>        in a given directory, LIBDIR, you must either use libtool, and
> >>        specify the full pathname of the library, or use the `-LLIBDIR'
> >>        flag during linking and do at least one of the following:
> >>        - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
> >>        during execution
> >>        - add LIBDIR to the `LD_RUN_PATH' environment variable
> >>        during linking
> >>        - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
> >>        - have your system administrator add LIBDIR to `/etc/ld.so.conf'
> >>        See any operating system documentation about shared libraries for
> >>        more information, such as the ld(1) and ld.so(8) manual pages.
> >>
> >> ----------------------------------------------------------------------
> >>        test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> >> "/usr/local/share/doc/snappy"
> >>         /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README
> >> format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
> >>        test -z "/usr/local/include" || /bin/mkdir -p
> "/usr/local/include"
> >>         /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> >> snappy-stubs-public.h snappy-c.h '/usr/local/include'
> >>        make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
> >>
> >> ll /usr/local/lib
> >>        -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
> >>        -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
> >>        lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
> >> libsnappy.so.1.2.0
> >>        lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 ->
> >> libsnappy.so.1.2.0
> >>        -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
> >>
> >>
> >>
> >> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <
> jean-marc@spaggiari.org>
> >> wrote:
> >>
> >>> Hi Arthur,
> >>>
> >>> Do you have snappy libs installed and configured? HBase doesn't come
> with
> >>> Snappy. So yo need to have it first.
> >>>
> >>> Shameless plug:
> >>>
> >>
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >>>
> >>> This is for 0.96 but should be very similar for 0.98. I will try it
> soon
> >>> and post and update, but keep us posted here so we can support you...
> >>>
> >>> JM
> >>>
> >>>
> >>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> >> arthur.hk.chan@gmail.com
> >>>> :
> >>>
> >>>> Hi,
> >>>>
> >>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is
> 2.4.1)
> >>>>
> >>>> Can you please advise what would be wrong?  Should my pom.xml be
> >> incorrect
> >>>> and missing something?
> >>>>
> >>>> Regards
> >>>> Arthur
> >>>>
> >>>>
> >>>> Below are my commands:
> >>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >>>> -Prelease,hadoop-snappy
> >>>>
> >>>> Iog:
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [INFO] Building HBase - Server 0.98.4-hadoop2
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [WARNING] The POM for
> org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> >>>> is missing, no dependency information available
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [INFO] Reactor Summary:
> >>>> [INFO]
> >>>> [INFO] HBase ............................................. SUCCESS
> >> [3.129s]
> >>>> [INFO] HBase - Common .................................... SUCCESS
> >> [3.105s]
> >>>> [INFO] HBase - Protocol .................................. SUCCESS
> >> [0.976s]
> >>>> [INFO] HBase - Client .................................... SUCCESS
> >> [0.925s]
> >>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> >> [0.183s]
> >>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> >> [0.497s]
> >>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
> >> [0.407s]
> >>>> [INFO] HBase - Server .................................... FAILURE
> >> [0.103s]
> >>>> [INFO] HBase - Testing Util .............................. SKIPPED
> >>>> [INFO] HBase - Thrift .................................... SKIPPED
> >>>> [INFO] HBase - Shell ..................................... SKIPPED
> >>>> [INFO] HBase - Integration Tests ......................... SKIPPED
> >>>> [INFO] HBase - Examples .................................. SKIPPED
> >>>> [INFO] HBase - Assembly .................................. SKIPPED
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [INFO] BUILD FAILURE
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [INFO] Total time: 9.939s
> >>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >>>> [INFO] Final Memory: 61M/2921M
> >>>> [INFO]
> >>>>
> ------------------------------------------------------------------------
> >>>> [ERROR] Failed to execute goal on project hbase-server: Could not
> >> resolve
> >>>> dependencies for project
> >> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> >>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >>>> http://maven.oschina.net/content/groups/public/ was cached in the
> local
> >>>> repository, resolution will not be reattempted until the update
> >> interval of
> >>>> nexus-osc has elapsed or updates are forced -> [Help 1]
> >>>> [ERROR]
> >>>> [ERROR] To see the full stack trace of the errors, re-run Maven with
> the
> >>>> -e switch.
> >>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >>>> [ERROR]
> >>>> [ERROR] For more information about the errors and possible solutions,
> >>>> please read the following articles:
> >>>> [ERROR] [Help 1]
> >>>>
> >>
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >>>> [ERROR]
> >>>> [ERROR] After correcting the problems, you can resume the build with
> the
> >>>> command
> >>>> [ERROR]   mvn <goals> -rf :hbase-server
> >>>>
> >>>>
> >>
> >>
>
>

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi JM

Not too sure what you mean. Do you mean I should create a new folder in my HBASE_SRC named lib/native/Linux-x86, copy these files to this folder, and then try to compile again?
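
Just to check my understanding, I imagine something like this sketch (the directories here are throwaway stand-ins I made up so it runs anywhere; on my box the libs are under /usr/local/lib and HBASE_SRC is the unpacked source tree):

```shell
# Sketch of the copy step: SNAPPY_LIB stands in for /usr/local/lib,
# HBASE_SRC for the HBase source folder.
SNAPPY_LIB=$(mktemp -d)
HBASE_SRC=$(mktemp -d)
touch "$SNAPPY_LIB/libsnappy.a" "$SNAPPY_LIB/libsnappy.so.1.2.0"   # stand-in files
mkdir -p "$HBASE_SRC/lib/native/Linux-x86"
cp -a "$SNAPPY_LIB"/libsnappy.* "$HBASE_SRC/lib/native/Linux-x86/"
ls "$HBASE_SRC/lib/native/Linux-x86"
```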

Regards
Arthur


On 26 Aug, 2014, at 8:17 pm, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> Hi Arthur,
> 
> Almost done! You now need to copy them on the HBase folder.
> 
> hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep -v
> .rb
> .
> ├── native
> │   └── Linux-x86
> │       ├── libsnappy.a
> │       ├── libsnappy.la
> │       ├── libsnappy.so
> │       ├── libsnappy.so.1
> │       └── libsnappy.so.1.2.0
> 
> I don't have any hadoop-snappy lib in my hbase folder and it works very
> well with Snappy for me...
> 
> JM
> 
> 2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>> :
> 
>> Hi JM,
>> 
>> Below are my steps to install snappy lib, do I miss something?
>> 
>> Regards
>> Arthur
>> 
>> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
>> tar -vxf snappy-1.1.1.tar.gz
>> cd snappy-1.1.1
>> ./configure
>> make
>> make install
>>        make[1]: Entering directory `/edh/hadoop_all_sources/snappy-1.1.1'
>>        test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>>         /bin/sh ./libtool   --mode=install /usr/bin/install -c
>> libsnappy.la '/usr/local/lib'
>>        libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0
>> /usr/local/lib/libsnappy.so.1.2.0
>>        libtool: install: (cd /usr/local/lib && { ln -s -f
>> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s
>> libsnappy.so.1.2.0 libsnappy.so.1; }; })
>>        libtool: install: (cd /usr/local/lib && { ln -s -f
>> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
>> libsnappy.so.1.2.0 libsnappy.so; }; })
>>        libtool: install: /usr/bin/install -c .libs/libsnappy.lai
>> /usr/local/lib/libsnappy.la
>>        libtool: install: /usr/bin/install -c .libs/libsnappy.a
>> /usr/local/lib/libsnappy.a
>>        libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>>        libtool: install: ranlib /usr/local/lib/libsnappy.a
>>        libtool: finish:
>> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
>> ldconfig -n /usr/local/lib
>> 
>> ----------------------------------------------------------------------
>>        Libraries have been installed in:
>>        /usr/local/lib
>>        If you ever happen to want to link against installed libraries
>>        in a given directory, LIBDIR, you must either use libtool, and
>>        specify the full pathname of the library, or use the `-LLIBDIR'
>>        flag during linking and do at least one of the following:
>>        - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>>        during execution
>>        - add LIBDIR to the `LD_RUN_PATH' environment variable
>>        during linking
>>        - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>>        - have your system administrator add LIBDIR to `/etc/ld.so.conf'
>>        See any operating system documentation about shared libraries for
>>        more information, such as the ld(1) and ld.so(8) manual pages.
>> 
>> ----------------------------------------------------------------------
>>        test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
>> "/usr/local/share/doc/snappy"
>>         /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README
>> format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
>>        test -z "/usr/local/include" || /bin/mkdir -p "/usr/local/include"
>>         /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
>> snappy-stubs-public.h snappy-c.h '/usr/local/include'
>>        make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
>> 
>> ll /usr/local/lib
>>        -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
>>        -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
>>        lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
>> libsnappy.so.1.2.0
>>        lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 ->
>> libsnappy.so.1.2.0
>>        -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
>> 
>> 
>> 
>> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
>> wrote:
>> 
>>> Hi Arthur,
>>> 
>>> Do you have snappy libs installed and configured? HBase doesn't come with
>>> Snappy. So you need to have it first.
>>> 
>>> Shameless plug:
>>> 
>> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
>>> 
>>> This is for 0.96 but should be very similar for 0.98. I will try it soon
>>> and post an update, but keep us posted here so we can support you...
>>> 
>>> JM
>>> 
>>> 
>>> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
>> arthur.hk.chan@gmail.com
>>>> :
>>> 
>>>> Hi,
>>>> 
>>>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is 2.4.1)
>>>> 
>>>> Can you please advise what might be wrong?  Could my pom.xml be
>> incorrect
>>>> or missing something?
>>>> 
>>>> Regards
>>>> Arthur
>>>> 
>>>> 
>>>> Below are my commands:
>>>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>>>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>>>> -Prelease,hadoop-snappy
>>>> 
>>>> Log:
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] Building HBase - Server 0.98.4-hadoop2
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
>>>> is missing, no dependency information available
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] Reactor Summary:
>>>> [INFO]
>>>> [INFO] HBase ............................................. SUCCESS
>> [3.129s]
>>>> [INFO] HBase - Common .................................... SUCCESS
>> [3.105s]
>>>> [INFO] HBase - Protocol .................................. SUCCESS
>> [0.976s]
>>>> [INFO] HBase - Client .................................... SUCCESS
>> [0.925s]
>>>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
>> [0.183s]
>>>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
>> [0.497s]
>>>> [INFO] HBase - Prefix Tree ............................... SUCCESS
>> [0.407s]
>>>> [INFO] HBase - Server .................................... FAILURE
>> [0.103s]
>>>> [INFO] HBase - Testing Util .............................. SKIPPED
>>>> [INFO] HBase - Thrift .................................... SKIPPED
>>>> [INFO] HBase - Shell ..................................... SKIPPED
>>>> [INFO] HBase - Integration Tests ......................... SKIPPED
>>>> [INFO] HBase - Examples .................................. SKIPPED
>>>> [INFO] HBase - Assembly .................................. SKIPPED
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] BUILD FAILURE
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [INFO] Total time: 9.939s
>>>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>>>> [INFO] Final Memory: 61M/2921M
>>>> [INFO]
>>>> ------------------------------------------------------------------------
>>>> [ERROR] Failed to execute goal on project hbase-server: Could not
>> resolve
>>>> dependencies for project
>> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
>>>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>>>> http://maven.oschina.net/content/groups/public/ was cached in the local
>>>> repository, resolution will not be reattempted until the update
>> interval of
>>>> nexus-osc has elapsed or updates are forced -> [Help 1]
>>>> [ERROR]
>>>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>>>> -e switch.
>>>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>>>> [ERROR]
>>>> [ERROR] For more information about the errors and possible solutions,
>>>> please read the following articles:
>>>> [ERROR] [Help 1]
>>>> 
>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>>>> [ERROR]
>>>> [ERROR] After correcting the problems, you can resume the build with the
>>>> command
>>>> [ERROR]   mvn <goals> -rf :hbase-server
>>>> 
>>>> 
>> 
>> 


Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by Jean-Marc Spaggiari <je...@spaggiari.org>.
Hi Arthur,

Almost done! You now need to copy them into the HBase folder.

hbase@hbasetest1:~/hbase-0.98.2-hadoop2/lib$ tree | grep -v .jar | grep -v
.rb
.
├── native
│   └── Linux-x86
│       ├── libsnappy.a
│       ├── libsnappy.la
│       ├── libsnappy.so
│       ├── libsnappy.so.1
│       └── libsnappy.so.1.2.0

I don't have any hadoop-snappy lib in my hbase folder and it works very
well with Snappy for me...

JM
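
JM's copy step above can be sketched in a couple of shell commands. This is a minimal sketch, not the canonical procedure: HBASE_HOME and the snappy source directory are assumptions you should adjust for your installation.

```shell
# Sketch of copying the freshly built snappy natives into HBase's lib tree.
# SRC and HBASE_HOME are assumed defaults -- override via environment.
SRC="${SNAPPY_LIB_DIR:-/usr/local/lib}"
HBASE_HOME="${HBASE_HOME:-$HOME/hbase-0.98.4-hadoop2}"
DEST="$HBASE_HOME/lib/native/Linux-x86"
mkdir -p "$DEST"
# Copy libsnappy.a, libsnappy.la, libsnappy.so* if present (-a keeps symlinks).
for f in "$SRC"/libsnappy*; do
  [ -e "$f" ] && cp -av "$f" "$DEST/" || true
done
ls -l "$DEST"
```

Note that depending on your JVM, the platform directory may be named Linux-amd64-64 rather than Linux-x86; the tree listing above is from JM's 32-bit setup.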

2014-08-26 8:09 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>:

> Hi JM,
>
> Below are my steps to install the snappy lib; did I miss something?
>
> Regards
> Arthur
>
> wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
> tar -vxf snappy-1.1.1.tar.gz
> cd snappy-1.1.1
> ./configure
> make
> make install
>         make[1]: Entering directory `/edh/hadoop_all_sources/snappy-1.1.1'
>         test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
>          /bin/sh ./libtool   --mode=install /usr/bin/install -c
> libsnappy.la '/usr/local/lib'
>         libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0
> /usr/local/lib/libsnappy.so.1.2.0
>         libtool: install: (cd /usr/local/lib && { ln -s -f
> libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s
> libsnappy.so.1.2.0 libsnappy.so.1; }; })
>         libtool: install: (cd /usr/local/lib && { ln -s -f
> libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s
> libsnappy.so.1.2.0 libsnappy.so; }; })
>         libtool: install: /usr/bin/install -c .libs/libsnappy.lai
> /usr/local/lib/libsnappy.la
>         libtool: install: /usr/bin/install -c .libs/libsnappy.a
> /usr/local/lib/libsnappy.a
>         libtool: install: chmod 644 /usr/local/lib/libsnappy.a
>         libtool: install: ranlib /usr/local/lib/libsnappy.a
>         libtool: finish:
> PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin"
> ldconfig -n /usr/local/lib
>
> ----------------------------------------------------------------------
>         Libraries have been installed in:
>         /usr/local/lib
>         If you ever happen to want to link against installed libraries
>         in a given directory, LIBDIR, you must either use libtool, and
>         specify the full pathname of the library, or use the `-LLIBDIR'
>         flag during linking and do at least one of the following:
>         - add LIBDIR to the `LD_LIBRARY_PATH' environment variable
>         during execution
>         - add LIBDIR to the `LD_RUN_PATH' environment variable
>         during linking
>         - use the `-Wl,-rpath -Wl,LIBDIR' linker flag
>         - have your system administrator add LIBDIR to `/etc/ld.so.conf'
>         See any operating system documentation about shared libraries for
>         more information, such as the ld(1) and ld.so(8) manual pages.
>
> ----------------------------------------------------------------------
>         test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p
> "/usr/local/share/doc/snappy"
>          /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README
> format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
>         test -z "/usr/local/include" || /bin/mkdir -p "/usr/local/include"
>          /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h
> snappy-stubs-public.h snappy-c.h '/usr/local/include'
>         make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'
>
> ll /usr/local/lib
>         -rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
>         -rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
>         lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so ->
> libsnappy.so.1.2.0
>         lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 ->
> libsnappy.so.1.2.0
>         -rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
>
>
>
> On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <je...@spaggiari.org>
> wrote:
>
> > Hi Arthur,
> >
> > Do you have snappy libs installed and configured? HBase doesn't come with
> > Snappy. So you need to have it first.
> >
> > Shameless plug:
> >
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> >
> > This is for 0.96 but should be very similar for 0.98. I will try it soon
> > and post an update, but keep us posted here so we can support you...
> >
> > JM
> >
> >
> > 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <
> arthur.hk.chan@gmail.com
> >> :
> >
> >> Hi,
> >>
> >> I need to install snappy to HBase 0.98.4.  (my Hadoop version is 2.4.1)
> >>
> >> Can you please advise what might be wrong?  Could my pom.xml be
> incorrect
> >> or missing something?
> >>
> >> Regards
> >> Arthur
> >>
> >>
> >> Below are my commands:
> >> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
> >> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
> >> -Prelease,hadoop-snappy
> >>
> >> Log:
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [INFO] Building HBase - Server 0.98.4-hadoop2
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
> >> is missing, no dependency information available
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [INFO] Reactor Summary:
> >> [INFO]
> >> [INFO] HBase ............................................. SUCCESS
> [3.129s]
> >> [INFO] HBase - Common .................................... SUCCESS
> [3.105s]
> >> [INFO] HBase - Protocol .................................. SUCCESS
> [0.976s]
> >> [INFO] HBase - Client .................................... SUCCESS
> [0.925s]
> >> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS
> [0.183s]
> >> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS
> [0.497s]
> >> [INFO] HBase - Prefix Tree ............................... SUCCESS
> [0.407s]
> >> [INFO] HBase - Server .................................... FAILURE
> [0.103s]
> >> [INFO] HBase - Testing Util .............................. SKIPPED
> >> [INFO] HBase - Thrift .................................... SKIPPED
> >> [INFO] HBase - Shell ..................................... SKIPPED
> >> [INFO] HBase - Integration Tests ......................... SKIPPED
> >> [INFO] HBase - Examples .................................. SKIPPED
> >> [INFO] HBase - Assembly .................................. SKIPPED
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [INFO] BUILD FAILURE
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [INFO] Total time: 9.939s
> >> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
> >> [INFO] Final Memory: 61M/2921M
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [ERROR] Failed to execute goal on project hbase-server: Could not
> resolve
> >> dependencies for project
> org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
> >> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
> >> http://maven.oschina.net/content/groups/public/ was cached in the local
> >> repository, resolution will not be reattempted until the update
> interval of
> >> nexus-osc has elapsed or updates are forced -> [Help 1]
> >> [ERROR]
> >> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> >> -e switch.
> >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >> [ERROR]
> >> [ERROR] For more information about the errors and possible solutions,
> >> please read the following articles:
> >> [ERROR] [Help 1]
> >>
> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
> >> [ERROR]
> >> [ERROR] After correcting the problems, you can resume the build with the
> >> command
> >> [ERROR]   mvn <goals> -rf :hbase-server
> >>
> >>
>
>

Re: Compilation error: HBASE 0.98.4 with Snappy

Posted by "Arthur.hk.chan@gmail.com" <ar...@gmail.com>.
Hi JM,

Below are my steps to install the snappy lib; did I miss something?

Regards
Arthur

wget https://snappy.googlecode.com/files/snappy-1.1.1.tar.gz
tar -vxf snappy-1.1.1.tar.gz
cd snappy-1.1.1
./configure
make
make install
	make[1]: Entering directory `/edh/hadoop_all_sources/snappy-1.1.1'
	test -z "/usr/local/lib" || /bin/mkdir -p "/usr/local/lib"
	 /bin/sh ./libtool   --mode=install /usr/bin/install -c   libsnappy.la '/usr/local/lib'
	libtool: install: /usr/bin/install -c .libs/libsnappy.so.1.2.0 /usr/local/lib/libsnappy.so.1.2.0
	libtool: install: (cd /usr/local/lib && { ln -s -f libsnappy.so.1.2.0 libsnappy.so.1 || { rm -f libsnappy.so.1 && ln -s libsnappy.so.1.2.0 libsnappy.so.1; }; })
	libtool: install: (cd /usr/local/lib && { ln -s -f libsnappy.so.1.2.0 libsnappy.so || { rm -f libsnappy.so && ln -s libsnappy.so.1.2.0 libsnappy.so; }; })
	libtool: install: /usr/bin/install -c .libs/libsnappy.lai /usr/local/lib/libsnappy.la
	libtool: install: /usr/bin/install -c .libs/libsnappy.a /usr/local/lib/libsnappy.a
	libtool: install: chmod 644 /usr/local/lib/libsnappy.a
	libtool: install: ranlib /usr/local/lib/libsnappy.a
	libtool: finish: PATH="/edh/hadoop/spark/bin:/edh/hadoop/hbase/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/yarn/hadoop/bin:/edh/hadoop/yarn/hadoop/sbin:/usr/lib64/qt-3.3/bin:/opt/apache-maven-3.1.1/bin:/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/edh/hadoop/zookeeper//bin:/edh/hadoop/hive//bin:/usr/lib/jvm/jdk1.6.0_45//bin:/root/bin:/sbin" ldconfig -n /usr/local/lib
	----------------------------------------------------------------------
	Libraries have been installed in:
   	/usr/local/lib
	If you ever happen to want to link against installed libraries
	in a given directory, LIBDIR, you must either use libtool, and
	specify the full pathname of the library, or use the `-LLIBDIR'
	flag during linking and do at least one of the following:
   	- add LIBDIR to the `LD_LIBRARY_PATH' environment variable
     	during execution
   	- add LIBDIR to the `LD_RUN_PATH' environment variable
     	during linking
   	- use the `-Wl,-rpath -Wl,LIBDIR' linker flag
   	- have your system administrator add LIBDIR to `/etc/ld.so.conf'
	See any operating system documentation about shared libraries for
	more information, such as the ld(1) and ld.so(8) manual pages.
	----------------------------------------------------------------------
	test -z "/usr/local/share/doc/snappy" || /bin/mkdir -p "/usr/local/share/doc/snappy"
	 /usr/bin/install -c -m 644 ChangeLog COPYING INSTALL NEWS README format_description.txt framing_format.txt '/usr/local/share/doc/snappy'
	test -z "/usr/local/include" || /bin/mkdir -p "/usr/local/include"
	 /usr/bin/install -c -m 644 snappy.h snappy-sinksource.h snappy-stubs-public.h snappy-c.h '/usr/local/include'
	make[1]: Leaving directory `/edh/hadoop_all_sources/snappy-1.1.1'

ll /usr/local/lib
	-rw-r--r--. 1 root root   233554 Aug 20 00:14 libsnappy.a
	-rwxr-xr-x. 1 root root      953 Aug 20 00:14 libsnappy.la
	lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so -> libsnappy.so.1.2.0
	lrwxrwxrwx. 1 root root       18 Aug 20 00:14 libsnappy.so.1 -> libsnappy.so.1.2.0
	-rwxr-xr-x. 1 root root   147726 Aug 20 00:14 libsnappy.so.1.2.0
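A quick, optional sanity check after `make install` can be sketched as below. This assumes the default `/usr/local` prefix shown in the transcript; adjust PREFIX if you passed --prefix to ./configure.

```shell
# Hedged post-install check: confirm the files and symlink chain that
# `make install` reported actually exist under the install prefix.
PREFIX="${PREFIX:-/usr/local}"
for f in libsnappy.a libsnappy.so libsnappy.so.1 libsnappy.so.1.2.0; do
  if [ -e "$PREFIX/lib/$f" ]; then
    echo "found   $PREFIX/lib/$f"
  else
    echo "missing $PREFIX/lib/$f"
  fi
done
```

If the loader later fails to find the library at runtime, the libtool notice in the transcript applies: add /usr/local/lib to LD_LIBRARY_PATH or /etc/ld.so.conf and rerun ldconfig.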



On 26 Aug, 2014, at 7:38 pm, Jean-Marc Spaggiari <je...@spaggiari.org> wrote:

> Hi Arthur,
> 
> Do you have snappy libs installed and configured? HBase doesn't come with
> Snappy. So you need to have it first.
> 
> Shameless plug:
> http://www.spaggiari.org/index.php/hbase/how-to-install-snappy-with-1#.U_xxSqdZuZY
> 
> This is for 0.96 but should be very similar for 0.98. I will try it soon
> and post an update, but keep us posted here so we can support you...
> 
> JM
> 
> 
> 2014-08-26 7:34 GMT-04:00 Arthur.hk.chan@gmail.com <arthur.hk.chan@gmail.com
>> :
> 
>> Hi,
>> 
>> I need to install snappy to HBase 0.98.4.  (my Hadoop version is 2.4.1)
>> 
>> Can you please advise what might be wrong?  Could my pom.xml be incorrect
>> or missing something?
>> 
>> Regards
>> Arthur
>> 
>> 
>> Below are my commands:
>> bash -x ./dev-support/generate-hadoopX-poms.sh 0.98.4 0.98.4-hadoop2
>> mvn -f pom.xml.hadoop2 install -DskipTests assembly:single
>> -Prelease,hadoop-snappy
>> 
>> Log:
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Building HBase - Server 0.98.4-hadoop2
>> [INFO]
>> ------------------------------------------------------------------------
>> [WARNING] The POM for org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT
>> is missing, no dependency information available
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Reactor Summary:
>> [INFO]
>> [INFO] HBase ............................................. SUCCESS [3.129s]
>> [INFO] HBase - Common .................................... SUCCESS [3.105s]
>> [INFO] HBase - Protocol .................................. SUCCESS [0.976s]
>> [INFO] HBase - Client .................................... SUCCESS [0.925s]
>> [INFO] HBase - Hadoop Compatibility ...................... SUCCESS [0.183s]
>> [INFO] HBase - Hadoop Two Compatibility .................. SUCCESS [0.497s]
>> [INFO] HBase - Prefix Tree ............................... SUCCESS [0.407s]
>> [INFO] HBase - Server .................................... FAILURE [0.103s]
>> [INFO] HBase - Testing Util .............................. SKIPPED
>> [INFO] HBase - Thrift .................................... SKIPPED
>> [INFO] HBase - Shell ..................................... SKIPPED
>> [INFO] HBase - Integration Tests ......................... SKIPPED
>> [INFO] HBase - Examples .................................. SKIPPED
>> [INFO] HBase - Assembly .................................. SKIPPED
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] BUILD FAILURE
>> [INFO]
>> ------------------------------------------------------------------------
>> [INFO] Total time: 9.939s
>> [INFO] Finished at: Tue Aug 26 19:23:14 HKT 2014
>> [INFO] Final Memory: 61M/2921M
>> [INFO]
>> ------------------------------------------------------------------------
>> [ERROR] Failed to execute goal on project hbase-server: Could not resolve
>> dependencies for project org.apache.hbase:hbase-server:jar:0.98.4-hadoop2:
>> Failure to find org.apache.hadoop:hadoop-snappy:jar:0.0.1-SNAPSHOT in
>> http://maven.oschina.net/content/groups/public/ was cached in the local
>> repository, resolution will not be reattempted until the update interval of
>> nexus-osc has elapsed or updates are forced -> [Help 1]
>> [ERROR]
>> [ERROR] To see the full stack trace of the errors, re-run Maven with the
>> -e switch.
>> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
>> [ERROR]
>> [ERROR] For more information about the errors and possible solutions,
>> please read the following articles:
>> [ERROR] [Help 1]
>> http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
>> [ERROR]
>> [ERROR] After correcting the problems, you can resume the build with the
>> command
>> [ERROR]   mvn <goals> -rf :hbase-server
>> 
>>
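
For reference, the "was cached in the local repository" part of the Maven error above means the failed hadoop-snappy lookup will not be retried until updates are forced. Two hedged ways forward, sketched as command-line fragments: the -U flag is standard Maven behavior, and dropping the profile follows JM's point that the native libsnappy files alone are enough.

```shell
# Option 1: force Maven to re-check remote repositories (-U), keeping the
# hadoop-snappy profile used in the thread.
mvn -U -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease,hadoop-snappy

# Option 2: build without the hadoop-snappy profile and rely on the native
# libsnappy files copied under lib/native instead.
mvn -f pom.xml.hadoop2 install -DskipTests assembly:single -Prelease
```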