Posted to user@phoenix.apache.org by Vikas Agarwal <vi...@infoobjects.com> on 2014/09/01 12:43:30 UTC

Issue in connecting with HBase using Hortonworks

Hi,

We have installed a Hadoop cluster using the Hortonworks distribution and
are trying to connect Phoenix with HBase. However, even after following the
steps mentioned here
<http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/bk_installing_manually_book/content/rpm-chap-phoenix.html>,
we are not able to connect Phoenix to HBase.

When I try to test the connection using sqlline.py, the command hangs and I
am not even able to interrupt it with Ctrl-C or Ctrl-Z. After some time
(around 10 minutes) it throws the same exception as psql.py, complaining
about a mismatch of Phoenix jars.
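
For reference, the connection test in question is typically launched with a
command along these lines (the ZooKeeper host and port are assumptions based
on the hostname visible in the logs and the default ZooKeeper port; they are
not confirmed anywhere in this thread):

```
/usr/lib/phoenix/bin/sqlline.py hdp.ambari:2181
```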

When I try to test the connection using Phoenix's psql.py command, the
following exceptions are thrown:

14/09/01 05:41:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
14/09/01 05:41:26 WARN util.DynamicClassLoader: Failed to identify the fs of dir hdfs://hdp.ambari:8020/apps/hbase/data/lib, ignored
java.io.IOException: No FileSystem for scheme: hdfs
    at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2385)
    at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
    at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
    at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
    at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
    at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104)
    at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:217)
    at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
    at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:86)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:853)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:657)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:419)
    at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:314)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:291)
    at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:252)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1447)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:187)
    at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:197)

java.sql.SQLException: ERROR 2006 (INT08): Incompatible jars detected between client and server. Ensure that phoenix.jar is put on the classpath of HBase in every region server: ERROR 1102 (XCL02): Cannot get all table regions
    at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:309)
    at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:932)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:831)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
    at java.sql.DriverManager.getConnection(DriverManager.java:571)
    at java.sql.DriverManager.getConnection(DriverManager.java:187)
    at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:197)
Caused by: java.sql.SQLException: ERROR 1102 (XCL02): Cannot get all table regions
    at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:309)
    at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:425)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:887)
    ... 13 more
Caused by: org.apache.hadoop.hbase.client.NoServerForRegionException: No server address listed in hbase:meta for region SYSTEM.CATALOG,,1408684006641.0d1ea455127dd4af6a806574b1f42a91. containing row
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1334)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1128)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1097)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1084)
    at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:904)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:411)
    ... 14 more
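
An editorial note on the first warning: "No FileSystem for scheme: hdfs"
usually means the client classpath has no FileSystem implementation
registered for hdfs:// URIs (for example, hadoop-hdfs missing from the
classpath, or a shaded jar that lost the ServiceLoader registrations). In
this log it is only a warning and the connection proceeds, but when it
causes real failures a common workaround is to name the implementation
explicitly in core-site.xml; this is a general Hadoop remedy, not a fix
confirmed by this thread:

```xml
<!-- Workaround sketch: pin the HDFS FileSystem implementation so it does
     not depend on ServiceLoader registration. Requires hadoop-hdfs on the
     client classpath. -->
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
</property>
```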

-- 
Regards,
Vikas Agarwal
91 – 9928301411

InfoObjects, Inc.
Execution Matters
http://www.infoobjects.com
2041 Mission College Boulevard, #280
Santa Clara, CA 95054
+1 (408) 988-2000 Work
+1 (408) 716-2726 Fax

Re: Issue in connecting with HBase using Hortonworks

Posted by Vikas Agarwal <vi...@infoobjects.com>.
Yes, that is what I understood from the docs. I thought I had missed
something because James said it is built into HDP 2.1 now. Now I get it: it
is in HDP 2.1's repository, but not in Ambari. :)
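
For anyone following along, the manual install described in the linked HDP
2.1 page boils down to installing the package and linking the Phoenix jar
into HBase's lib directory on every region server. Below is a sketch of the
jar-linking step, demonstrated in scratch directories so it can be run
anywhere; the /usr/lib paths in the comments are the documented HDP
defaults, not verified here:

```shell
# On a real HDP 2.1 cluster the steps are roughly (assumed paths):
#   yum install phoenix
#   ln -s /usr/lib/phoenix/lib/phoenix-core-*.jar /usr/lib/hbase/lib/
#   ...then restart HBase on every region server.
#
# Sandboxed demonstration of the symlink step:
PHOENIX_LIB=$(mktemp -d)
HBASE_LIB=$(mktemp -d)
touch "$PHOENIX_LIB/phoenix-core-4.0.0.jar"      # stand-in for the real jar
ln -s "$PHOENIX_LIB"/phoenix-core-*.jar "$HBASE_LIB/"
ls -l "$HBASE_LIB"
```

The symlink (rather than a copy) keeps the HBase classpath pointing at
whatever version the package manager installs.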


On Tue, Sep 2, 2014 at 5:29 AM, Devaraj Das <dd...@hortonworks.com> wrote:

> Vikas,
> Phoenix installation is not supported via Ambari yet. Please follow
> the instructions on the page you mentioned to use Phoenix with HDP.
> Thanks
> Devaraj
>
> On Mon, Sep 1, 2014 at 10:33 AM, Vikas Agarwal <vi...@infoobjects.com>
> wrote:
> > I didn't see Phoenix as an option anywhere in Ambari, and I found this
> >
> http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/bk_installing_manually_book/content/rpm-chap-phoenix.html
> >
> > Did I miss something in Ambari for Phoenix?
> >
> >
> > On Mon, Sep 1, 2014 at 11:01 PM, James Taylor <ja...@apache.org>
> > wrote:
> >>
> >> Hi Vikas,
> >> Glad you got it working. Just curious - why did you install Phoenix
> >> via yum when HDP 2.1 already comes with Phoenix pre-installed?
> >> Thanks,
> >> James
> >>
> >> On Mon, Sep 1, 2014 at 10:16 AM, Vikas Agarwal <vi...@infoobjects.com>
> >> wrote:
> >> > Yes, I am using HDP 2.1 and installed Phoenix via yum, which installed
> >> > Phoenix 4.0.0. I added a symlink to the phoenix-core*.jar in
> >> > /usr/lib/hbase/lib and restarted after these changes.
> >> >
> >> > However, I am now able to connect with HBase. Somehow, SYSTEM.CATALOG
> >> > got created in one of my earlier attempts to connect Phoenix to HBase,
> >> > and it didn't contain the region server information Phoenix needs. So I
> >> > deleted the table, hoping Phoenix would recreate it; it did, and the
> >> > connection worked after that. :)
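
Editorial note: the recovery described above amounts to dropping the stale
catalog table from the HBase shell so that the next Phoenix connection
recreates it. This is destructive and only safe when there is no other
Phoenix metadata to lose; a sketch of the session:

```
hbase(main):001:0> disable 'SYSTEM.CATALOG'
hbase(main):002:0> drop 'SYSTEM.CATALOG'
```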
> >> >
> >> >
> >> > On Mon, Sep 1, 2014 at 6:54 PM, Nicolas Maillard
> >> > <nm...@hortonworks.com>
> >> > wrote:
> >> >>
> >> >> Hello,
> >> >> Just to be sure: are you using HDP 2.1, did you get the Phoenix jars
> >> >> from the page you listed, have you put the Phoenix jar in the lib
> >> >> directory on all the HBase nodes, and have you restarted the whole
> >> >> HBase service?
> >> >> If so, could you also paste the command line you use to start sqlline.
> >> >>
> >> >>
> >> >> On Mon, Sep 1, 2014 at 12:43 PM, Vikas Agarwal <
> vikas@infoobjects.com>
> >> >> wrote:
> >> >>>
> >> >>> [original message quoted in full; snipped]
> >> >>
> >> >>
> >> >>
> >> >> CONFIDENTIALITY NOTICE
> >> >> NOTICE: This message is intended for the use of the individual or
> >> >> entity
> >> >> to which it is addressed and may contain information that is
> >> >> confidential,
> >> >> privileged and exempt from disclosure under applicable law. If the
> >> >> reader of
> >> >> this message is not the intended recipient, you are hereby notified
> >> >> that any
> >> >> printing, copying, dissemination, distribution, disclosure or
> >> >> forwarding of
> >> >> this communication is strictly prohibited. If you have received this
> >> >> communication in error, please contact the sender immediately and
> >> >> delete it
> >> >> from your system. Thank You.
> >> >
> >> >
> >> >
> >> >
> >
> >
> >
> >
>




Re: Issue in connecting with HBase using Hortonworks

Posted by Devaraj Das <dd...@hortonworks.com>.
Vikas,
Phoenix installation is not supported via Ambari yet. Please follow
the instructions on the page you mentioned to use Phoenix with HDP.
Thanks
Devaraj

On Mon, Sep 1, 2014 at 10:33 AM, Vikas Agarwal <vi...@infoobjects.com> wrote:
> I didn't see Phoenix as an option anywhere in Ambari, and I found this
> http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/bk_installing_manually_book/content/rpm-chap-phoenix.html
>
> Did I miss something in Ambari for Phoenix?
>
>
> On Mon, Sep 1, 2014 at 11:01 PM, James Taylor <ja...@apache.org>
> wrote:
>>
>> Hi Vikas,
>> Glad you got it working. Just curious - why did you install Phoenix
>> via yum when HDP 2.1 already comes with Phoenix pre-installed?
>> Thanks,
>> James
>>
>> On Mon, Sep 1, 2014 at 10:16 AM, Vikas Agarwal <vi...@infoobjects.com>
>> wrote:
>> > Yes, I am using HDP 2.1 and installed Phoenix via yum, which installed
>> > Phoenix 4.0.0. I added a symlink to the phoenix-core*.jar in
>> > /usr/lib/hbase/lib and restarted after these changes.
>> >
>> > However, I am now able to connect with HBase. Somehow, SYSTEM.CATALOG
>> > got created in one of my earlier attempts to connect Phoenix to HBase,
>> > and it didn't contain the region server information Phoenix needs. So I
>> > deleted the table, hoping Phoenix would recreate it; it did, and the
>> > connection worked after that. :)
>> >
>> >
>> > On Mon, Sep 1, 2014 at 6:54 PM, Nicolas Maillard
>> > <nm...@hortonworks.com>
>> > wrote:
>> >>
>> >> Hello,
>> >> Just to be sure: are you using HDP 2.1, did you get the Phoenix jars
>> >> from the page you listed, have you put the Phoenix jar in the lib
>> >> directory on all the HBase nodes, and have you restarted the whole
>> >> HBase service?
>> >> If so, could you also paste the command line you use to start sqlline.
>> >>
>> >>
>> >> On Mon, Sep 1, 2014 at 12:43 PM, Vikas Agarwal <vi...@infoobjects.com>
>> >> wrote:
>> >>>
>> >>> [original message quoted in full; snipped]
>> >>
>> >>
>> >>
>> >
>> >
>> >
>> >
>
>
>
>


Re: Issue in connecting with HBase using Hortonworks

Posted by Vikas Agarwal <vi...@infoobjects.com>.
I didn't see Phoenix as an option anywhere in Ambari, and I found this:
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/bk_installing_manually_book/content/rpm-chap-phoenix.html

Did I miss something in Ambari for Phoenix?


On Mon, Sep 1, 2014 at 11:01 PM, James Taylor <ja...@apache.org>
wrote:

> Hi Vikas,
> Glad you got it working. Just curious - why did you install Phoenix
> via yum when HDP 2.1 already comes with Phoenix pre-installed?
> Thanks,
> James

Re: Issue in connecting with HBase using Hortonworks

Posted by James Taylor <ja...@apache.org>.
Hi Vikas,
Glad you got it working. Just curious - why did you install Phoenix
via yum when HDP 2.1 already comes with Phoenix pre-installed?
Thanks,
James

On Mon, Sep 1, 2014 at 10:16 AM, Vikas Agarwal <vi...@infoobjects.com> wrote:
> Yes, I am using HDP 2.1. I installed Phoenix via yum, which installed
> Phoenix 4.0.0, added a symlink to the phoenix-core*.jar in
> /usr/lib/hbase/lib, and restarted HBase after these changes.
>
> However, I am now able to connect to HBase. Somehow, SYSTEM.CATALOG had
> been created in one of my earlier attempts to connect Phoenix to HBase,
> and it didn't contain the region server information Phoenix needs. So I
> deleted the table, hoping Phoenix would recreate it; it did, and the
> connection worked after that. :)

Re: Issue in connecting with HBase using Hortonworks

Posted by Vikas Agarwal <vi...@infoobjects.com>.
Yes, I am using HDP 2.1. I installed Phoenix via yum, which installed
Phoenix 4.0.0, added a symlink to the phoenix-core*.jar in
/usr/lib/hbase/lib, and restarted HBase after these changes.

However, I am now able to connect to HBase. Somehow, SYSTEM.CATALOG had
been created in one of my earlier attempts to connect Phoenix to HBase,
and it didn't contain the region server information Phoenix needs. So I
deleted the table, hoping Phoenix would recreate it; it did, and the
connection worked after that. :)
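
In case it helps anyone else hitting the same NoServerForRegionException,
the cleanup looked roughly like this from the HBase shell (a sketch from
memory, not an exact transcript; note that dropping SYSTEM.CATALOG
discards any Phoenix table metadata stored in it, so only do this when
that metadata is disposable):

```shell
# Drop the stale SYSTEM.CATALOG table so Phoenix can recreate it on the
# next client connection. WARNING: this discards Phoenix metadata.
echo "disable 'SYSTEM.CATALOG'
drop 'SYSTEM.CATALOG'
exit" | hbase shell
```

After this, the next sqlline.py/psql.py connection recreated the table.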


On Mon, Sep 1, 2014 at 6:54 PM, Nicolas Maillard <nm...@hortonworks.com>
wrote:

> Hello,
> Just to be sure: are you using HDP 2.1, did you get the Phoenix jars
> from the page you listed, have you put the Phoenix jar in the lib
> directory on all of the HBase nodes, and have you restarted the whole
> HBase service?
> If so, could you also paste the command line you use to start sqlline?

Re: Issue in connecting with HBase using Hortonworks

Posted by Nicolas Maillard <nm...@hortonworks.com>.
Hello,
Just to be sure: are you using HDP 2.1, did you get the Phoenix jars from
the page you listed, have you put the Phoenix jar in the lib directory on
all of the HBase nodes, and have you restarted the whole HBase service?
If so, could you also paste the command line you use to start sqlline?
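
For reference, the steps I mean look roughly like this (a sketch only:
the jar path, version number, hostname and the /hbase-unsecure znode are
typical HDP 2.1 defaults and may well differ on your cluster):

```shell
# On every HBase node: link the Phoenix server jar into HBase's lib dir
# (example path and version; use whatever the phoenix package installed).
ln -s /usr/lib/phoenix/lib/phoenix-core-4.0.0.jar /usr/lib/hbase/lib/

# Restart the whole HBase service (e.g. via Ambari) so the region
# servers load the jar, then connect with sqlline against the ZK quorum:
/usr/lib/phoenix/bin/sqlline.py zk-host1:2181:/hbase-unsecure
```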


On Mon, Sep 1, 2014 at 12:43 PM, Vikas Agarwal <vi...@infoobjects.com>
wrote:

> Hi,
>
> We have installed Hadoop cluster using Hortonworks distribution and trying
> to connect the Phoenix with HBase. However, even after following the steps
> mentioned here
> <http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.1.3/bk_installing_manually_book/content/rpm-chap-phoenix.html>,
> we are not able to connect Phoenix to HBase.
>
> When I am trying to test the connection using sqlline.py, the command
> hangs the control and I am even not able to do Ctrl-C or Ctrl-Z. After
> sometime (around 10 min) it throws same exception as with psql.py,
> complaining mismatch of phoenix jars.
>
> When I am trying to test the connection using Phoenix's psql.py command,
> following set of exceptions are coming:
>
> 14/09/01 05:41:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
> 14/09/01 05:41:26 WARN util.DynamicClassLoader: Failed to identify the fs of dir hdfs://hdp.ambari:8020/apps/hbase/data/lib, ignored
> java.io.IOException: No FileSystem for scheme: hdfs
> at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2385)
> at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2392)
> at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
> at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
> at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
> at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
> at org.apache.hadoop.fs.Path.getFileSystem(Path.java:296)
> at org.apache.hadoop.hbase.util.DynamicClassLoader.<init>(DynamicClassLoader.java:104)
> at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:217)
> at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
> at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:75)
> at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:86)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:853)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:657)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at org.apache.hadoop.hbase.client.ConnectionManager.createConnection(ConnectionManager.java:419)
> at org.apache.hadoop.hbase.client.ConnectionManager.createConnectionInternal(ConnectionManager.java:314)
> at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:291)
> at org.apache.phoenix.query.HConnectionFactory$HConnectionFactoryImpl.createConnection(HConnectionFactory.java:47)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.openConnection(ConnectionQueryServicesImpl.java:252)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1447)
> at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
> at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
> at java.sql.DriverManager.getConnection(DriverManager.java:571)
> at java.sql.DriverManager.getConnection(DriverManager.java:187)
> at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:197)
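The "No FileSystem for scheme: hdfs" error above usually means the client JVM cannot resolve an implementation class for the hdfs:// scheme, typically because the hadoop-hdfs client jar is missing from the classpath or its FileSystem service-loader entry was lost when jars were repackaged. One commonly suggested workaround (a sketch only; verify the file location against your HDP layout before applying) is to register the implementation explicitly in core-site.xml:

```xml
<!-- core-site.xml: explicitly map the hdfs:// scheme to its
     implementation so FileSystem.getFileSystemClass can resolve it -->
<property>
  <name>fs.hdfs.impl</name>
  <value>org.apache.hadoop.hdfs.DistributedFileSystem</value>
</property>
```

This only papers over the symptom; if the hadoop-hdfs jar itself is absent from the client classpath, it still needs to be added.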
>
> java.sql.SQLException: ERROR 2006 (INT08): Incompatible jars detected between client and server. Ensure that phoenix.jar is put on the classpath of HBase in every region server: ERROR 1102 (XCL02): Cannot get all table regions
> at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:309)
> at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:932)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:831)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1058)
> at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:1156)
> at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:422)
> at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
> at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:226)
> at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:908)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:1453)
> at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:131)
> at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.connect(PhoenixEmbeddedDriver.java:112)
> at java.sql.DriverManager.getConnection(DriverManager.java:571)
> at java.sql.DriverManager.getConnection(DriverManager.java:187)
> at org.apache.phoenix.util.PhoenixRuntime.main(PhoenixRuntime.java:197)
> Caused by: java.sql.SQLException: ERROR 1102 (XCL02): Cannot get all table regions
> at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:309)
> at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:133)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:425)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:887)
> ... 13 more
> Caused by: org.apache.hadoop.hbase.client.NoServerForRegionException: No server address listed in hbase:meta for region SYSTEM.CATALOG,,1408684006641.0d1ea455127dd4af6a806574b1f42a91. containing row
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1334)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1128)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1097)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.relocateRegion(ConnectionManager.java:1084)
> at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:904)
> at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:411)
> ... 14 more
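For the "Incompatible jars detected" error above, the usual first check is whether a Phoenix server jar is actually present on every region server's HBase classpath (for example, the output of the `hbase classpath` command). A tiny illustrative helper for scanning such a classpath string is sketched below; the jar names and paths in the example are assumptions, not output from this cluster:

```python
def find_phoenix_jars(classpath: str) -> list:
    """Return the classpath entries that look like Phoenix jars."""
    return [entry for entry in classpath.split(":")
            if "phoenix" in entry.lower() and entry.endswith(".jar")]

# Example with a hypothetical region server classpath:
cp = ("/usr/lib/hbase/lib/hbase-client.jar:"
      "/usr/lib/hbase/lib/phoenix-core-4.0.0-incubating.jar")
print(find_phoenix_jars(cp))
```

If the list is empty on any region server, copying the Phoenix server jar into the HBase lib directory and restarting that region server is the documented fix for this error.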
>
> --
> Regards,
> Vikas Agarwal
> 91 – 9928301411
>
> InfoObjects, Inc.
> Execution Matters
> http://www.infoobjects.com
> 2041 Mission College Boulevard, #280
> Santa Clara, CA 95054
> +1 (408) 988-2000 Work
> +1 (408) 716-2726 Fax
>
>
