Posted to user@phoenix.apache.org by Saba Varatharajaperumal <sa...@gmail.com> on 2015/11/29 03:10:51 UTC

Enabling secondary index in HBase - Phoenix

Hi

I have a 3-node HBase (version 1.1.2) cluster on which I am trying to use
Phoenix. I am using Phoenix version 4.5.2. I have copied
"phoenix-4.5.2-HBase-1.1-server.jar" to the hbase/lib folder and changed
hbase-site.xml as described on the Phoenix web site. When I start HBase,
I get the error below.

The phoenix-4.5.2-HBase-1.1-server.jar file is definitely under the hbase/lib
folder on all servers, including the master.
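
For reference, the change described on that page is the WAL codec property
for secondary indexing; on my servers the relevant hbase-site.xml section
looks roughly like this (config path assumed):

$ grep -A 2 'hbase.regionserver.wal.codec' /usr/local/hbase/conf/hbase-site.xml
    <name>hbase.regionserver.wal.codec</name>
    <value>org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec</value>
  </property>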

Can someone help me with this?

java.io.IOException: Cannot get log reader
at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:344)
at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:266)
at org.apache.hadoop.hbase.wal.WALSplitter.getReader(WALSplitter.java:839)
at org.apache.hadoop.hbase.wal.WALSplitter.getReader(WALSplitter.java:763)
at
org.apache.hadoop.hbase.wal.WALSplitter.splitLogFile(WALSplitter.java:297)
at
org.apache.hadoop.hbase.wal.WALSplitter.splitLogFile(WALSplitter.java:235)
at
org.apache.hadoop.hbase.regionserver.SplitLogWorker$1.exec(SplitLogWorker.java:104)
at
org.apache.hadoop.hbase.regionserver.handler.WALSplitterHandler.process(WALSplitterHandler.java:72)
at org.apache.hadoop.hbase.executor.EventHandler.run(EventHandler.java:128)
at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.UnsupportedOperationException: Unable to find
org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec
at
org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:36)
at
org.apache.hadoop.hbase.regionserver.wal.WALCellCodec.create(WALCellCodec.java:103)
at
org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader.getCodec(ProtobufLogReader.java:280)
at
org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader.initAfterCompression(ProtobufLogReader.java:290)
at
org.apache.hadoop.hbase.regionserver.wal.ReaderBase.init(ReaderBase.java:82)
at
org.apache.hadoop.hbase.regionserver.wal.ProtobufLogReader.init(ProtobufLogReader.java:147)
at org.apache.hadoop.hbase.wal.WALFactory.createReader(WALFactory.java:301)
... 11 more
Caused by: java.lang.ClassNotFoundException:
org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at
org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:32)
... 17 more


Thanks, and I appreciate your time.

Saba

Re: Enabling secondary index in HBase - Phoenix

Posted by Saba Varatharajaperumal <sa...@gmail.com>.
Hi

I just wanted to provide an update on this. Some good news.

I couldn't figure out why there was a ClassNotFoundException even after so
many restarts. When I was trying to create the Phoenix index, I had an
existing Phoenix table with data in it.

As an exercise I re-installed HBase and Phoenix, and everything works fine.
I had read in a comment that the Phoenix index setup doesn't work when the
WAL has existing data waiting to be applied. Maybe that was the case for me;
I'm not sure.
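
If anyone else runs into this, one thing I would check next time before
re-installing is whether there are WAL files still waiting to be split when
the codec property is added (assuming the default hbase.rootdir of /hbase):

# per-server WAL directories on HDFS; a "*-splitting" directory means HBase
# is still trying to replay those logs (and will need the codec class to do it)
$ hdfs dfs -ls /hbase/WALs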

Thanks
Saba

Re: Enabling secondary index in HBase - Phoenix

Posted by Saba Varatharajaperumal <sa...@gmail.com>.
Hi Jesse/James

Thanks for the quick response. Appreciate it.

The jar file definitely has the class, so I must be using the right
"server" jar.

1. /usr/local/hbase/lib$ jar -vtf phoenix-4.5.2-HBase-1.1-server.jar | grep
'org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec'
  1879 Fri Sep 11 15:47:06 UTC 2015
org/apache/hadoop/hbase/regionserver/wal/IndexedWALEditCodec$CompressedIndexKeyValueDecoder.class
  2461 Fri Sep 11 15:47:06 UTC 2015
org/apache/hadoop/hbase/regionserver/wal/IndexedWALEditCodec.class
  1263 Fri Sep 11 15:47:06 UTC 2015
org/apache/hadoop/hbase/regionserver/wal/IndexedWALEditCodec$IndexKeyValueDecoder.class
  1827 Fri Sep 11 15:47:06 UTC 2015
org/apache/hadoop/hbase/regionserver/wal/IndexedWALEditCodec$CompressedIndexKeyValueEncoder.class
  1269 Fri Sep 11 15:47:06 UTC 2015
org/apache/hadoop/hbase/regionserver/wal/IndexedWALEditCodec$IndexKeyValueEncoder.class


2. I executed "hbase classpath" on one of the region servers and grepped the
output for phoenix; it does include the same Phoenix jar. So I assume this is
sufficient for the classloader to find the class, unless it is being loaded
by a customized class loader or something.


hduser@reportingnode2:/usr/local/hbase/bin$ hbase classpath | grep phoenix
/usr/local/hbase/conf:/usr/lib/jvm/jdk1.8.0_45/lib/tools.jar:/usr/local/hbase:/usr/local/hbase/lib/activation-1.1.jar:/usr/local/hbase/lib/aopalliance-1.0.jar:/usr/local/hbase/lib/apacheds-i18n-2.0.0-M15.jar:/usr/local/hbase/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/usr/local/hbase/lib/api-asn1-api-1.0.0-M20.jar:/usr/local/hbase/lib/api-util-1.0.0-M20.jar:/usr/local/hbase/lib/asm-3.1.jar:/usr/local/hbase/lib/avro-1.7.4.jar:/usr/local/hbase/lib/commons-beanutils-1.7.0.jar:/usr/local/hbase/lib/commons-beanutils-core-1.8.0.jar:/usr/local/hbase/lib/commons-cli-1.2.jar:/usr/local/hbase/lib/commons-codec-1.9.jar:/usr/local/hbase/lib/commons-collections-3.2.1.jar:/usr/local/hbase/lib/commons-compress-1.4.1.jar:/usr/local/hbase/lib/commons-configuration-1.6.jar:/usr/local/hbase/lib/commons-daemon-1.0.13.jar:/usr/local/hbase/lib/commons-digester-1.8.jar:/usr/local/hbase/lib/commons-el-1.0.jar:/usr/local/hbase/lib/commons-httpclient-3.1.jar:/usr/local/hbase/lib/commons-io-2.4.jar:/usr/local/hbase/lib/commons-lang-2.6.jar:/usr/local/hbase/lib/commons-logging-1.2.jar:/usr/local/hbase/lib/commons-math-2.2.jar:/usr/local/hbase/lib/commons-math3-3.1.1.jar:/usr/local/hbase/lib/commons-net-3.1.jar:/usr/local/hbase/lib/disruptor-3.3.0.jar:/usr/local/hbase/lib/findbugs-annotations-1.3.9-1.jar:/usr/local/hbase/lib/guava-12.0.1.jar:/usr/local/hbase/lib/guice-3.0.jar:/usr/local/hbase/lib/guice-servlet-3.0.jar:/usr/local/hbase/lib/hadoop-annotations-2.5.1.jar:/usr/local/hbase/lib/hadoop-auth-2.5.1.jar:/usr/local/hbase/lib/hadoop-client-2.5.1.jar:/usr/local/hbase/lib/hadoop-common-2.5.1.jar:/usr/local/hbase/lib/hadoop-hdfs-2.5.1.jar:/usr/local/hbase/lib/hadoop-mapreduce-client-app-2.5.1.jar:/usr/local/hbase/lib/hadoop-mapreduce-client-common-2.5.1.jar:/usr/local/hbase/lib/hadoop-mapreduce-client-core-2.5.1.jar:/usr/local/hbase/lib/hadoop-mapreduce-client-jobclient-2.5.1.jar:/usr/local/hbase/lib/hadoop-mapreduce-client-shuffle-2.5.1.jar:/usr/local/hbase/lib/hadoop-yarn-api-2.5.1.jar:/usr/local/hbase/lib/hadoop-yarn-client-2.5.1.jar:/usr/local/hbase/lib/hadoop-yarn-common-2.5.1.jar:/usr/local/hbase/lib/hadoop-yarn-server-common-2.5.1.jar:/usr/local/hbase/lib/hbase-annotations-1.1.2.jar:/usr/local/hbase/lib/hbase-annotations-1.1.2-tests.jar:/usr/local/hbase/lib/hbase-client-1.1.2.jar:/usr/local/hbase/lib/hbase-common-1.1.2.jar:/usr/local/hbase/lib/hbase-common-1.1.2-tests.jar:/usr/local/hbase/lib/hbase-examples-1.1.2.jar:/usr/local/hbase/lib/hbase-hadoop2-compat-1.1.2.jar:/usr/local/hbase/lib/hbase-hadoop-compat-1.1.2.jar:/usr/local/hbase/lib/hbase-it-1.1.2.jar:/usr/local/hbase/lib/hbase-it-1.1.2-tests.jar:/usr/local/hbase/lib/hbase-prefix-tree-1.1.2.jar:/usr/local/hbase/lib/hbase-procedure-1.1.2.jar:/usr/local/hbase/lib/hbase-protocol-1.1.2.jar:/usr/local/hbase/lib/hbase-resource-bundle-1.1.2.jar:/usr/local/hbase/lib/hbase-rest-1.1.2.jar:/usr/local/hbase/lib/hbase-server-1.1.2.jar:/usr/local/hbase/lib/hbase-server-1.1.2-tests.jar:/usr/local/hbase/lib/hbase-shell-1.1.2.jar:/usr/local/hbase/lib/hbase-thrift-1.1.2.jar:/usr/local/hbase/lib/htrace-core-3.1.0-incubating.jar:/usr/local/hbase/lib/httpclient-4.2.5.jar:/usr/local/hbase/lib/httpcore-4.1.3.jar:/usr/local/hbase/lib/jackson-core-asl-1.9.13.jar:/usr/local/hbase/lib/jackson-jaxrs-1.9.13.jar:/usr/local/hbase/lib/jackson-mapper-asl-1.9.13.jar:/usr/local/hbase/lib/jackson-xc-1.9.13.jar:/usr/local/hbase/lib/jamon-runtime-2.3.1.jar:/usr/local/hbase/lib/jasper-compiler-5.5.23.jar:/usr/local/hbase/lib/jasper-runtime-5.5.23.jar:/usr/local/hbase/lib/javax.inject-1.jar:/usr
/local/hbase/lib/java-xmlbuilder-0.4.jar:/usr/local/hbase/lib/jaxb-api-2.2.2.jar:/usr/local/hbase/lib/jaxb-impl-2.2.3-1.jar:/usr/local/hbase/lib/jcodings-1.0.8.jar:/usr/local/hbase/lib/jersey-client-1.9.jar:/usr/local/hbase/lib/jersey-core-1.9.jar:/usr/local/hbase/lib/jersey-guice-1.9.jar:/usr/local/hbase/lib/jersey-json-1.9.jar:/usr/local/hbase/lib/jersey-server-1.9.jar:/usr/local/hbase/lib/jets3t-0.9.0.jar:/usr/local/hbase/lib/jettison-1.3.3.jar:/usr/local/hbase/lib/jetty-6.1.26.jar:/usr/local/hbase/lib/jetty-sslengine-6.1.26.jar:/usr/local/hbase/lib/jetty-util-6.1.26.jar:/usr/local/hbase/lib/joni-2.1.2.jar:/usr/local/hbase/lib/jruby-complete-1.6.8.jar:/usr/local/hbase/lib/jsch-0.1.42.jar:/usr/local/hbase/lib/jsp-2.1-6.1.14.jar:/usr/local/hbase/lib/jsp-api-2.1-6.1.14.jar:/usr/local/hbase/lib/jsr305-1.3.9.jar:/usr/local/hbase/lib/junit-4.11.jar:/usr/local/hbase/lib/leveldbjni-all-1.8.jar:/usr/local/hbase/lib/libthrift-0.9.0.jar:/usr/local/hbase/lib/log4j-1.2.17.jar:/usr/local/hbase/lib/metrics-core-2.2.0.jar:/usr/local/hbase/lib/netty-3.2.4.Final.jar:/usr/local/hbase/lib/netty-all-4.0.23.Final.jar:/usr/local/hbase/lib/paranamer-2.3.jar:
*/usr/local/hbase/lib/phoenix-4.5.2-HBase-1.1-server.jar:*
/usr/local/hbase/lib/protobuf-java-2.5.0.jar:/usr/local/hbase/lib/servlet-api-2.5-6.1.14.jar:/usr/local/hbase/lib/servlet-api-2.5.jar:/usr/local/hbase/lib/slf4j-api-1.7.7.jar:/usr/local/hbase/lib/slf4j-log4j12-1.7.5.jar:/usr/local/hbase/lib/snappy-java-1.0.4.1.jar:/usr/local/hbase/lib/spymemcached-2.11.6.jar:/usr/local/hbase/lib/xmlenc-0.52.jar:/usr/local/hbase/lib/xz-1.0.jar:/usr/local/hbase/lib/zookeeper-3.4.6.jar:/usr/local/hadoop-2.7.1/etc/hadoop:/usr/local/hadoop/share/hadoop/common/lib/*:/usr/local/hadoop/share/hadoop/common/*:/usr/local/hadoop/share/hadoop/hdfs:/usr/local/hadoop/share/hadoop/hdfs/lib/*:/usr/local/hadoop/share/hadoop/hdfs/*:/usr/local/hadoop-2.7.1/share/hadoop/yarn/lib/*:/usr/local/hadoop-2.7.1/share/hadoop/yarn/*:/usr/local/hadoop/share/hadoop/mapreduce/lib/*:/usr/local/hadoop/share/hadoop/mapreduce/*:/usr/local/hadoop/contrib/capacity-scheduler/*.jar
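
One more quick sanity check, since the bin/hbase script can run an arbitrary
class name: asking it to "run" the codec at least shows whether the class
resolves on the classpath the script builds (the exact error text may vary
with the JDK):

# class resolvable -> "Error: Main method not found in class ...IndexedWALEditCodec"
# class missing    -> "Error: Could not find or load main class ..."
$ hbase org.apache.hadoop.hbase.regionserver.wal.IndexedWALEditCodec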


Cheers
Saba

On Sun, Nov 29, 2015 at 1:33 PM, James Taylor <ja...@apache.org>
wrote:

> Make sure it's the right jar too: there are two with the word "server" in
> them.
>
>
> On Saturday, November 28, 2015, Jesse Yates <je...@gmail.com>
> wrote:
>
>> I think with that version of Phoenix you should have that class.
>>
>> 1. Can you grep the jar contents and ensure the class
>> (IndexedWALEditCodec) is there?
>> 2. Can you check the hbase classpath to ensure the jar is getting picked
>> up? (bin/hbase classpath)
>>

Re: Enabling secondary index in HBase - Phoenix

Posted by James Taylor <ja...@apache.org>.
Make sure it's the right jar too: there are two with the word "server" in
them.
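
E.g. you can list them from wherever the binary tarball was unpacked (path
below is just an example):

$ ls /path/to/phoenix-4.5.2-HBase-1.1-bin/phoenix-*server*.jar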

On Saturday, November 28, 2015, Jesse Yates <je...@gmail.com> wrote:

> I think with that version of Phoenix you should have that class.
>
> 1. Can you grep the jar contents and ensure the class
> (IndexedWALEditCodec) is there?
> 2. Can you check the hbase classpath to ensure the jar is getting picked
> up? (bin/hbase classpath)
>

Re: Enabling secondary index in HBase - Phoenix

Posted by Jesse Yates <je...@gmail.com>.
I think with that version of Phoenix you should have that class.

1. Can you grep the jar contents and ensure the class (IndexedWALEditCodec)
is there?
2. Can you check the hbase classpath to ensure the jar is getting picked
up? (bin/hbase classpath)
